Apr 17 16:52:34.975191 ip-10-0-134-88 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 16:52:35.399021 ip-10-0-134-88 kubenswrapper[2568]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 16:52:35.399021 ip-10-0-134-88 kubenswrapper[2568]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 16:52:35.399021 ip-10-0-134-88 kubenswrapper[2568]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 16:52:35.399021 ip-10-0-134-88 kubenswrapper[2568]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 16:52:35.399021 ip-10-0-134-88 kubenswrapper[2568]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 16:52:35.399676 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.399590 2568 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 16:52:35.401884 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.401869 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 16:52:35.401936 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.401885 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 16:52:35.401936 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.401889 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 16:52:35.401936 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.401892 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 16:52:35.401936 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.401896 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 16:52:35.401936 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.401900 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 16:52:35.401936 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.401902 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 16:52:35.401936 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.401906 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 16:52:35.401936 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.401909 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 16:52:35.401936 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.401911 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 16:52:35.401936 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.401914 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 16:52:35.401936 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.401931 2568 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 16:52:35.401936 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.401933 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 16:52:35.401936 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.401936 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 16:52:35.401936 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.401939 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 16:52:35.401936 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.401942 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 16:52:35.401936 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.401945 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 16:52:35.402316 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.401948 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 16:52:35.402316 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.401950 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 16:52:35.402316 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.401953 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 16:52:35.402316 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.401956 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 16:52:35.402316 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.401963 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 16:52:35.402316 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.401967 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 16:52:35.402316 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.401971 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 16:52:35.402316 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.401974 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 16:52:35.402316 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.401978 2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 16:52:35.402316 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.401980 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 16:52:35.402316 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.401983 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 16:52:35.402316 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.401986 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 16:52:35.402316 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.401989 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 16:52:35.402316 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.401991 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 16:52:35.402316 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.401994 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 16:52:35.402316 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.401997 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 16:52:35.402316 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402000 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 16:52:35.402316 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402002 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 16:52:35.402316 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402005 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 16:52:35.402316 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402007 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 16:52:35.402810 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402010 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 16:52:35.402810 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402012 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 16:52:35.402810 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402015 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 16:52:35.402810 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402018 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 16:52:35.402810 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402020 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 16:52:35.402810 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402022 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 16:52:35.402810 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402025 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 16:52:35.402810 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402027 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 16:52:35.402810 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402031 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
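The five deprecation warnings at the top of this log all point at the same remedy: carry the settings in the KubeletConfiguration file passed via --config (here /etc/kubernetes/kubelet.conf, per the FLAG dump further down). A minimal sketch of the equivalent config-file fields, assuming the values shown in this log's own FLAG output; the evictionHard threshold is a hypothetical placeholder, since --minimum-container-ttl-duration has no one-to-one config equivalent and the warning suggests eviction settings instead:

  apiVersion: kubelet.config.k8s.io/v1beta1
  kind: KubeletConfiguration
  # replaces --container-runtime-endpoint="/var/run/crio/crio.sock"
  containerRuntimeEndpoint: /var/run/crio/crio.sock
  # replaces --volume-plugin-dir
  volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
  # replaces --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
  systemReserved:
    cpu: 500m
    ephemeral-storage: 1Gi
    memory: 1Gi
  # suggested replacement for --minimum-container-ttl-duration; threshold is illustrative
  evictionHard:
    memory.available: 100Mi

--pod-infra-container-image has no config-file counterpart; per the server.go:212 line above, the sandbox image should also be set in the remote runtime (for CRI-O, its pause_image setting).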
Apr 17 16:52:35.402810 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402035 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 16:52:35.402810 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402038 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 16:52:35.402810 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402040 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 16:52:35.402810 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402043 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 16:52:35.402810 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402046 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 16:52:35.402810 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402049 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 16:52:35.402810 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402052 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 16:52:35.402810 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402055 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 16:52:35.402810 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402058 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 16:52:35.402810 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402060 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 16:52:35.403281 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402063 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 16:52:35.403281 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402065 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 16:52:35.403281 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402068 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 16:52:35.403281 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402071 2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 16:52:35.403281 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402073 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 16:52:35.403281 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402075 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 16:52:35.403281 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402078 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 16:52:35.403281 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402081 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 16:52:35.403281 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402083 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 16:52:35.403281 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402086 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 16:52:35.403281 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402089 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 16:52:35.403281 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402092 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 16:52:35.403281 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402095 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 16:52:35.403281 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402097 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 16:52:35.403281 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402100 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 16:52:35.403281 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402103 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 16:52:35.403281 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402106 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 16:52:35.403281 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402108 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 16:52:35.403281 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402111 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 16:52:35.403281 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402113 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 16:52:35.403760 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402116 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 16:52:35.403760 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402118 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 16:52:35.403760 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402121 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 16:52:35.403760 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402123 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 16:52:35.403760 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402126 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 16:52:35.403760 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402129 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 16:52:35.403760 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402132 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 16:52:35.403760 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402135 2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 16:52:35.403760 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402138 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 16:52:35.403760 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402141 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 16:52:35.403760 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402683 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 16:52:35.403760 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402689 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 16:52:35.403760 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402692 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 16:52:35.403760 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402695 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 16:52:35.403760 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402699 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 16:52:35.403760 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402701 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 16:52:35.403760 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402704 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 16:52:35.403760 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402707 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 16:52:35.403760 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402710 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 16:52:35.403760 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402713 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 16:52:35.404259 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402716 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 16:52:35.404259 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402718 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 16:52:35.404259 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402721 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 16:52:35.404259 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402724 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 16:52:35.404259 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402727 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 16:52:35.404259 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402729 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 16:52:35.404259 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402732 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 16:52:35.404259 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402735 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 16:52:35.404259 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402738 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 16:52:35.404259 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402740 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 16:52:35.404259 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402743 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 16:52:35.404259 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402745 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 16:52:35.404259 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402748 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 16:52:35.404259 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402750 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 16:52:35.404259 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402753 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 16:52:35.404259 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402755 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 16:52:35.404259 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402759 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 16:52:35.404259 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402761 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 16:52:35.404259 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402765 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 16:52:35.404259 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402767 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 16:52:35.404752 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402770 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 16:52:35.404752 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402773 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 16:52:35.404752 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402775 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 16:52:35.404752 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402778 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 16:52:35.404752 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402780 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 16:52:35.404752 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402782 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 16:52:35.404752 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402785 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 16:52:35.404752 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402787 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 16:52:35.404752 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402790 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 16:52:35.404752 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402794 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 16:52:35.404752 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402797 2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 16:52:35.404752 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402801 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
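All of the names flagged by feature_gate.go:328 are OpenShift cluster-level feature gates rather than upstream Kubernetes ones. Since the command line passes --feature-gates="" (see the FLAG dump below), the gates evidently reach the kubelet through the featureGates map of the rendered config file; the kubelet warns on each name it does not recognize and continues, while gates it does know (ServiceAccountTokenNodeBinding, KMSv1) are applied with the GA/deprecation notices above. A minimal sketch of such a stanza, with hypothetical boolean values:

  apiVersion: kubelet.config.k8s.io/v1beta1
  kind: KubeletConfiguration
  featureGates:
    # recognized by the kubelet; applied with the feature_gate.go:351/:349 notices
    ServiceAccountTokenNodeBinding: true
    KMSv1: true
    # OpenShift-only gates such as these trigger the feature_gate.go:328 warnings
    RouteAdvertisements: false   # value hypothetical
    GatewayAPI: false            # value hypothetical

The warning block recurs several times in this log with advancing timestamps, consistent with the gate map being parsed more than once during kubelet startup.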
Apr 17 16:52:35.404752 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402805 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 16:52:35.404752 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402808 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 16:52:35.404752 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402810 2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 16:52:35.404752 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402814 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 16:52:35.404752 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402816 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 16:52:35.404752 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402819 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 16:52:35.404752 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402822 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 16:52:35.404752 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402825 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 16:52:35.405248 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402828 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 16:52:35.405248 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402831 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 16:52:35.405248 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402833 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 16:52:35.405248 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402835 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 16:52:35.405248 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402838 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 16:52:35.405248 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402840 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 16:52:35.405248 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402843 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 16:52:35.405248 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402845 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 16:52:35.405248 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402848 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 16:52:35.405248 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402851 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 16:52:35.405248 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402853 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 16:52:35.405248 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402856 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 16:52:35.405248 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402859 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 16:52:35.405248 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402861 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 16:52:35.405248 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402863 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 16:52:35.405248 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402866 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 16:52:35.405248 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402868 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 16:52:35.405248 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402871 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 16:52:35.405248 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402875 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 16:52:35.405248 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402877 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 16:52:35.405785 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402880 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 16:52:35.405785 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402882 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 16:52:35.405785 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402884 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 16:52:35.405785 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402887 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 16:52:35.405785 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402889 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 16:52:35.405785 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402892 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 16:52:35.405785 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402894 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 16:52:35.405785 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402897 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 16:52:35.405785 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402899 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 16:52:35.405785 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402902 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 16:52:35.405785 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402904 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 16:52:35.405785 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402907 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 16:52:35.405785 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402910 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 16:52:35.405785 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402912 2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 16:52:35.405785 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402915 2568 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 16:52:35.405785 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.402932 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 16:52:35.405785 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403005 2568 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 16:52:35.405785 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403012 2568 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 16:52:35.405785 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403018 2568 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 16:52:35.405785 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403023 2568 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 16:52:35.406359 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403027 2568 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 16:52:35.406359 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403030 2568 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 16:52:35.406359 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403035 2568 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 16:52:35.406359 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403040 2568 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 16:52:35.406359 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403043 2568 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 16:52:35.406359 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403046 2568 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 16:52:35.406359 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403050 2568 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 16:52:35.406359 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403053 2568 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 16:52:35.406359 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403056 2568 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 16:52:35.406359 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403059 2568 flags.go:64] FLAG: --cgroup-root=""
Apr 17 16:52:35.406359 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403062 2568 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 16:52:35.406359 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403064 2568 flags.go:64] FLAG: --client-ca-file=""
Apr 17 16:52:35.406359 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403067 2568 flags.go:64] FLAG: --cloud-config=""
Apr 17 16:52:35.406359 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403071 2568 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 16:52:35.406359 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403074 2568 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 16:52:35.406359 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403078 2568 flags.go:64] FLAG: --cluster-domain=""
Apr 17 16:52:35.406359 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403081 2568 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 16:52:35.406359 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403084 2568 flags.go:64] FLAG: --config-dir=""
Apr 17 16:52:35.406359 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403087 2568 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 16:52:35.406359 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403090 2568 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 16:52:35.406359 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403094 2568 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 16:52:35.406359 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403097 2568 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 16:52:35.406359 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403101 2568 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 16:52:35.406359 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403104 2568 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 16:52:35.407030 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403107 2568 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 16:52:35.407030 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403110 2568 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 16:52:35.407030 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403113 2568 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 16:52:35.407030 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403116 2568 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 16:52:35.407030 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403119 2568 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 16:52:35.407030 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403124 2568 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 16:52:35.407030 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403127 2568 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 16:52:35.407030 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403130 2568 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 16:52:35.407030 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403133 2568 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 16:52:35.407030 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403136 2568 flags.go:64] FLAG: --enable-server="true"
Apr 17 16:52:35.407030 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403140 2568 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 16:52:35.407030 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403145 2568 flags.go:64] FLAG: --event-burst="100"
Apr 17 16:52:35.407030 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403148 2568 flags.go:64] FLAG: --event-qps="50"
Apr 17 16:52:35.407030 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403151 2568 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 16:52:35.407030 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403154 2568 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 17 16:52:35.407030 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403157 2568 flags.go:64] FLAG: --eviction-hard=""
Apr 17 16:52:35.407030 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403161 2568 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 17 16:52:35.407030 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403164 2568 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 17 16:52:35.407030 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403167 2568 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 17 16:52:35.407030 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403170 2568 flags.go:64] FLAG: --eviction-soft=""
Apr 17 16:52:35.407030 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403173 2568 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 17 16:52:35.407030 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403176 2568 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 17 16:52:35.407030 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403179 2568 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 17 16:52:35.407030 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403181 2568 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 17 16:52:35.407030 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403184 2568 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 17 16:52:35.407739 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403187 2568 flags.go:64] FLAG: --fail-swap-on="true"
Apr 17 16:52:35.407739 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403190 2568 flags.go:64] FLAG: --feature-gates=""
Apr 17 16:52:35.407739 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403194 2568 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 17 16:52:35.407739 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403197 2568 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 17 16:52:35.407739 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403200 2568 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 17 16:52:35.407739 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403203 2568 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 17 16:52:35.407739 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403206 2568 flags.go:64] FLAG: --healthz-port="10248"
Apr 17 16:52:35.407739 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403209 2568 flags.go:64] FLAG: --help="false"
Apr 17 16:52:35.407739 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403211 2568 flags.go:64] FLAG: --hostname-override="ip-10-0-134-88.ec2.internal"
Apr 17 16:52:35.407739 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403215 2568 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 17 16:52:35.407739 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403218 2568 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 17 16:52:35.407739 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403220 2568 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 17 16:52:35.407739 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403224 2568 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 17 16:52:35.407739 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403227 2568 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 17 16:52:35.407739 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403230 2568 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 17 16:52:35.407739 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403233 2568 flags.go:64] FLAG: --image-service-endpoint=""
Apr 17 16:52:35.407739 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403236 2568 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 17 16:52:35.407739 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403239 2568 flags.go:64] FLAG: --kube-api-burst="100"
Apr 17 16:52:35.407739 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403242 2568 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 17 16:52:35.407739 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403245 2568 flags.go:64] FLAG: --kube-api-qps="50"
Apr 17 16:52:35.407739 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403248 2568 flags.go:64] FLAG: --kube-reserved=""
Apr 17 16:52:35.407739 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403251 2568 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 17 16:52:35.407739 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403254 2568 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 17 16:52:35.407739 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403257 2568 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 17 16:52:35.408498 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403260 2568 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 17 16:52:35.408498 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403263 2568 flags.go:64] FLAG: --lock-file=""
Apr 17 16:52:35.408498 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403266 2568 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 17 16:52:35.408498 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403269 2568 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 17 16:52:35.408498 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403272 2568 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 17 16:52:35.408498 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403277 2568 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 17 16:52:35.408498 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403280 2568 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 17 16:52:35.408498 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403283 2568 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 17 16:52:35.408498 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403286 2568 flags.go:64] FLAG: --logging-format="text"
Apr 17 16:52:35.408498 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403289 2568 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 17 16:52:35.408498 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403292 2568 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 17 16:52:35.408498 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403295 2568 flags.go:64] FLAG: --manifest-url=""
Apr 17 16:52:35.408498 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403298 2568 flags.go:64] FLAG: --manifest-url-header=""
Apr 17 16:52:35.408498 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403302 2568 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 17 16:52:35.408498 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403306 2568 flags.go:64] FLAG: --max-open-files="1000000"
Apr 17 16:52:35.408498 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403310 2568 flags.go:64] FLAG: --max-pods="110"
Apr 17 16:52:35.408498 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403313 2568 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 17 16:52:35.408498 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403316 2568 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 17 16:52:35.408498 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403319 2568 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 17 16:52:35.408498 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403322 2568 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 17 16:52:35.408498 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403325 2568 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 17 16:52:35.408498 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403328 2568 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 17 16:52:35.408498 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403331 2568 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 17 16:52:35.408498 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403338 2568 flags.go:64] FLAG: --node-status-max-images="50"
Apr 17 16:52:35.408498 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403341 2568 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 17 16:52:35.409234 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403344 2568 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 17 16:52:35.409234 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403347 2568 flags.go:64] FLAG: --pod-cidr=""
Apr 17 16:52:35.409234 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403350 2568 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 17 16:52:35.409234 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403356 2568 flags.go:64] FLAG: --pod-manifest-path=""
Apr 17 16:52:35.409234 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403360 2568 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 17 16:52:35.409234 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403363 2568 flags.go:64] FLAG: --pods-per-core="0"
Apr 17 16:52:35.409234 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403366 2568 flags.go:64] FLAG: --port="10250"
Apr 17 16:52:35.409234 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403369 2568 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 17 16:52:35.409234 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403372 2568 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0d6787be42c0b6f76"
Apr 17 16:52:35.409234 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403375 2568 flags.go:64] FLAG: --qos-reserved=""
Apr 17 16:52:35.409234 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403378 2568 flags.go:64] FLAG: --read-only-port="10255"
Apr 17 16:52:35.409234 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403381 2568 flags.go:64] FLAG: --register-node="true"
Apr 17 16:52:35.409234 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403384 2568 flags.go:64] FLAG: --register-schedulable="true"
Apr 17 16:52:35.409234 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403387 2568 flags.go:64] FLAG: --register-with-taints=""
Apr 17 16:52:35.409234 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403391 2568 flags.go:64] FLAG: --registry-burst="10"
Apr 17 16:52:35.409234 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403393 2568 flags.go:64] FLAG: --registry-qps="5"
Apr 17 16:52:35.409234 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403396 2568 flags.go:64] FLAG: --reserved-cpus=""
Apr 17 16:52:35.409234 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403399 2568 flags.go:64] FLAG: --reserved-memory=""
Apr 17 16:52:35.409234 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403402 2568 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 17 16:52:35.409234 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403405 2568 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 17 16:52:35.409234 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403408 2568 flags.go:64] FLAG: --rotate-certificates="false"
Apr 17 16:52:35.409234 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403411 2568 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 17 16:52:35.409234 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403414 2568 flags.go:64] FLAG: --runonce="false"
Apr 17 16:52:35.409234 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403416 2568 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 17 16:52:35.409234 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403420 2568 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 17 16:52:35.409882 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403423 2568 flags.go:64] FLAG: --seccomp-default="false"
Apr 17 16:52:35.409882 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403427 2568 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 17 16:52:35.409882 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403430 2568 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 17 16:52:35.409882 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403433 2568 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 17 16:52:35.409882 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403436 2568 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 17 16:52:35.409882 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403440 2568 flags.go:64] FLAG: --storage-driver-password="root"
Apr 17 16:52:35.409882 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403443 2568 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 17 16:52:35.409882 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403446 2568 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 17 16:52:35.409882 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403448 2568 flags.go:64] FLAG: --storage-driver-user="root"
Apr 17 16:52:35.409882 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403451 2568 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 17 16:52:35.409882 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403455 2568 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 17 16:52:35.409882 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403458 2568 flags.go:64] FLAG: --system-cgroups=""
Apr 17 16:52:35.409882 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403460 2568 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 17 16:52:35.409882 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403466 2568 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 17 16:52:35.409882 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403468 2568 flags.go:64] FLAG: --tls-cert-file=""
Apr 17 16:52:35.409882 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403471 2568 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 17 16:52:35.409882 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403476 2568 flags.go:64] FLAG: --tls-min-version=""
Apr 17 16:52:35.409882 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403479 2568 flags.go:64] FLAG: --tls-private-key-file=""
Apr 17 16:52:35.409882 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403481 2568 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 17 16:52:35.409882 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403484 2568 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 17 16:52:35.409882 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403487 2568 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 17 16:52:35.409882 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403490 2568 flags.go:64] FLAG: --v="2"
Apr 17 16:52:35.409882 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403494 2568 flags.go:64] FLAG: --version="false"
Apr 17 16:52:35.409882 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403498 2568 flags.go:64] FLAG: --vmodule=""
Apr 17 16:52:35.409882 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403502 2568 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 17 16:52:35.410509 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.403505 2568 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 17 16:52:35.410509 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403596 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 16:52:35.410509 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403600 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 16:52:35.410509 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403603 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 16:52:35.410509 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403606 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 16:52:35.410509 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403609 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 16:52:35.410509 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403612 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 16:52:35.410509 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403614 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 16:52:35.410509 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403617 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 16:52:35.410509 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403620 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 16:52:35.410509 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403623 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 16:52:35.410509 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403625 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 16:52:35.410509 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403628 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 16:52:35.410509 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403631 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 16:52:35.410509 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403634 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 16:52:35.410509 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403636 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 16:52:35.410509 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403639 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 16:52:35.410509 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403643 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 16:52:35.410509 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403647 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 16:52:35.410972 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403649 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 16:52:35.410972 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403652 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 16:52:35.410972 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403654 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 16:52:35.410972 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403657 2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 16:52:35.410972 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403659 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 16:52:35.410972 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403662 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 16:52:35.410972 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403666 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 16:52:35.410972 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403669 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 16:52:35.410972 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403672 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 16:52:35.410972 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403675 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 16:52:35.410972 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403678 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 16:52:35.410972 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403681 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 16:52:35.410972 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403683 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 16:52:35.410972 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403686 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 16:52:35.410972 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403689 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 16:52:35.410972 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403692 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 16:52:35.410972 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403694 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 16:52:35.410972 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403696 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 16:52:35.410972 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403699 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 16:52:35.411456 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403701 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 16:52:35.411456 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403704 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 16:52:35.411456 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403706 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 16:52:35.411456 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403709 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 16:52:35.411456 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403712 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 16:52:35.411456 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403714 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 16:52:35.411456 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403719 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 16:52:35.411456 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403725 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 16:52:35.411456 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403727 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 16:52:35.411456 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403730 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 16:52:35.411456 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403733 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 16:52:35.411456 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403735 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 16:52:35.411456 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403738 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 16:52:35.411456 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403741 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 16:52:35.411456 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403743 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 16:52:35.411456 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403746 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 16:52:35.411456 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403749 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 16:52:35.411456 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403751 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 16:52:35.411456 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403754 2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 16:52:35.411456 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403756 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 16:52:35.411963 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403759 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 16:52:35.411963 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403761 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 16:52:35.411963 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403764 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 16:52:35.411963 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403767 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 16:52:35.411963 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403770 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 16:52:35.411963 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403773 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 16:52:35.411963 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403775 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 16:52:35.411963 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403778 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 16:52:35.411963 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403780 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 16:52:35.411963 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403783 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 16:52:35.411963 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403786 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 16:52:35.411963 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403788 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 16:52:35.411963 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403791 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 16:52:35.411963 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403793 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 16:52:35.411963 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403796 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 16:52:35.411963 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403798 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 16:52:35.411963 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403801 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 16:52:35.411963 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403804 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 16:52:35.411963 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403808 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 16:52:35.411963 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403812 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 16:52:35.412503 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403815 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 16:52:35.412503 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403817 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 16:52:35.412503 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403820 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 16:52:35.412503 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403823 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 16:52:35.412503 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403825 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 16:52:35.412503 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403828 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 16:52:35.412503 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403830 2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 16:52:35.412503 ip-10-0-134-88 kubenswrapper[2568]: W0417
16:52:35.403833 2568 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 16:52:35.412503 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.403835 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 16:52:35.412503 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.404784 2568 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 16:52:35.412503 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.411452 2568 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 17 16:52:35.412503 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.411559 2568 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 17 16:52:35.412503 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411621 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 16:52:35.412503 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411626 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 16:52:35.412503 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411631 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 16:52:35.412883 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411636 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 16:52:35.412883 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411639 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 16:52:35.412883 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411642 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 16:52:35.412883 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411645 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 16:52:35.412883 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411648 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 16:52:35.412883 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411651 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 16:52:35.412883 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411654 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 16:52:35.412883 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411657 2568 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 16:52:35.412883 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411660 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 16:52:35.412883 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411663 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 16:52:35.412883 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411665 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 16:52:35.412883 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411668 2568 feature_gate.go:328] 
unrecognized feature gate: PinnedImages Apr 17 16:52:35.412883 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411671 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 16:52:35.412883 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411673 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 16:52:35.412883 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411676 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 16:52:35.412883 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411679 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 16:52:35.412883 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411681 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 16:52:35.412883 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411685 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 16:52:35.412883 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411688 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 17 16:52:35.413365 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411692 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 16:52:35.413365 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411695 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 16:52:35.413365 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411698 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 16:52:35.413365 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411700 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 16:52:35.413365 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411703 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 16:52:35.413365 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411705 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 16:52:35.413365 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411708 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 16:52:35.413365 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411711 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 16:52:35.413365 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411713 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 16:52:35.413365 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411718 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 16:52:35.413365 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411721 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 16:52:35.413365 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411724 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 16:52:35.413365 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411726 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 16:52:35.413365 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411729 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 16:52:35.413365 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411731 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 16:52:35.413365 
ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411734 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 16:52:35.413365 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411736 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 16:52:35.413365 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411738 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 16:52:35.413365 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411741 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 16:52:35.413365 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411743 2568 feature_gate.go:328] unrecognized feature gate: Example Apr 17 16:52:35.413847 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411746 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 16:52:35.413847 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411748 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 16:52:35.413847 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411751 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 16:52:35.413847 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411753 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 16:52:35.413847 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411756 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 16:52:35.413847 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411758 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 16:52:35.413847 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411761 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 16:52:35.413847 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411763 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 16:52:35.413847 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411766 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 16:52:35.413847 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411768 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 16:52:35.413847 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411771 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 16:52:35.413847 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411774 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 16:52:35.413847 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411776 2568 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 16:52:35.413847 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411779 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 16:52:35.413847 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411781 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 16:52:35.413847 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411784 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 16:52:35.413847 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411787 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 16:52:35.413847 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411789 2568 feature_gate.go:328] unrecognized feature 
gate: NoRegistryClusterOperations Apr 17 16:52:35.413847 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411791 2568 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 16:52:35.413847 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411794 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 16:52:35.414353 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411796 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 16:52:35.414353 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411799 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 16:52:35.414353 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411802 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 16:52:35.414353 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411804 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 16:52:35.414353 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411807 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 16:52:35.414353 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411809 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 16:52:35.414353 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411812 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 16:52:35.414353 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411814 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 16:52:35.414353 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411817 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 16:52:35.414353 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411819 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 16:52:35.414353 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411822 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 16:52:35.414353 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411824 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 16:52:35.414353 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411827 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 16:52:35.414353 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411829 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 16:52:35.414353 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411832 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 16:52:35.414353 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411834 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 16:52:35.414353 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411837 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 16:52:35.414353 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411839 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 16:52:35.414353 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411842 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 16:52:35.414353 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411844 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 16:52:35.414837 ip-10-0-134-88 kubenswrapper[2568]: W0417 
16:52:35.411847 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 16:52:35.414837 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411849 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 16:52:35.414837 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411852 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 16:52:35.414837 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411855 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 16:52:35.414837 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.411859 2568 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 16:52:35.414837 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411971 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 16:52:35.414837 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411977 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 16:52:35.414837 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411980 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 16:52:35.414837 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411983 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 16:52:35.414837 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411985 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 16:52:35.414837 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411988 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 16:52:35.414837 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411991 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 16:52:35.414837 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411993 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 16:52:35.414837 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411996 2568 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 16:52:35.414837 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.411998 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 16:52:35.414837 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.412002 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 16:52:35.415272 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.412005 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 16:52:35.415272 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.412008 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 16:52:35.415272 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.412010 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 16:52:35.415272 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.412013 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 16:52:35.415272 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.412015 
2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 16:52:35.415272 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.412018 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 16:52:35.415272 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.412020 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 16:52:35.415272 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.412023 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 16:52:35.415272 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.412025 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 16:52:35.415272 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.412027 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 16:52:35.415272 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.412030 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 16:52:35.415272 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.412032 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 16:52:35.415272 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.412035 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 16:52:35.415272 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.412037 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 16:52:35.415272 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.412040 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 16:52:35.415272 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.412042 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 16:52:35.415272 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.412046 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 16:52:35.415272 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.412048 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 16:52:35.415272 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.412051 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 16:52:35.415736 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.412054 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 16:52:35.415736 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.412056 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 16:52:35.415736 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.412059 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 16:52:35.415736 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.412061 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 16:52:35.415736 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.412064 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 16:52:35.415736 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.412066 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 16:52:35.415736 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.412069 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 16:52:35.415736 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.412071 2568 feature_gate.go:328] 
unrecognized feature gate: MetricsCollectionProfiles Apr 17 16:52:35.415736 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.412074 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 16:52:35.415736 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.412076 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 16:52:35.415736 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.412079 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 16:52:35.415736 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.412081 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 16:52:35.415736 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.412084 2568 feature_gate.go:328] unrecognized feature gate: Example Apr 17 16:52:35.415736 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.412087 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 16:52:35.415736 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.412089 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 16:52:35.415736 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.412091 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 16:52:35.415736 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.412094 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 16:52:35.415736 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.412096 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 16:52:35.415736 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.412099 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 16:52:35.415736 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.412101 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 16:52:35.416235 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.412104 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 16:52:35.416235 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.412106 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 16:52:35.416235 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.412108 2568 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 16:52:35.416235 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.412111 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 16:52:35.416235 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.412113 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 16:52:35.416235 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.412116 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 16:52:35.416235 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.412118 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 16:52:35.416235 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.412121 2568 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 16:52:35.416235 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.412124 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 16:52:35.416235 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.412127 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 17 16:52:35.416235 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.412131 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 16:52:35.416235 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.412133 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 16:52:35.416235 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.412137 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 16:52:35.416235 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.412139 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 16:52:35.416235 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.412142 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 16:52:35.416235 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.412145 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 16:52:35.416235 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.412147 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 16:52:35.416235 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.412150 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 16:52:35.416235 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.412152 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 16:52:35.416235 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.412155 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 16:52:35.416745 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.412157 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 16:52:35.416745 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.412160 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 16:52:35.416745 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.412162 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 16:52:35.416745 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.412164 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 16:52:35.416745 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.412167 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 16:52:35.416745 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.412170 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 16:52:35.416745 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.412172 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 16:52:35.416745 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.412175 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 16:52:35.416745 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.412177 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 16:52:35.416745 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.412180 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 16:52:35.416745 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.412182 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 16:52:35.416745 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.412186 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 16:52:35.416745 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.412189 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 16:52:35.416745 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.412192 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 16:52:35.416745 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.412194 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 16:52:35.416745 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:35.412196 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 16:52:35.417143 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.412201 2568 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 16:52:35.417143 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.412832 2568 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 17 16:52:35.417143 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.414727 2568 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 17 16:52:35.417143 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.415644 2568 server.go:1019] "Starting client certificate rotation"
Apr 17 16:52:35.417143 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.415738 2568 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 16:52:35.417143 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.415796 2568 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 16:52:35.438693 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.438676 2568 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 16:52:35.442876 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.442854 2568 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 16:52:35.455008 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.454989 2568 log.go:25] "Validated CRI v1 runtime API"
Apr 17 16:52:35.463764 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.463747 2568 log.go:25] "Validated CRI v1 image API"
Apr 17 16:52:35.464990 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.464968 2568 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 17 16:52:35.466757 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.466741 2568 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 16:52:35.468676 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.468649 2568 fs.go:135] Filesystem UUIDs: map[250f6fda-6393-4f86-be12-710dd2006cfa:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 930fd235-db33-412c-a7ba-d169c5cd49c6:/dev/nvme0n1p3]
Apr 17 16:52:35.468746 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.468674 2568 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 17 16:52:35.474181 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.474068 2568 manager.go:217] Machine: {Timestamp:2026-04-17 16:52:35.472307844 +0000 UTC m=+0.379192651 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3101460 MemoryCapacity:33164500992 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2f0634e736391d61cc9478a5040b5e SystemUUID:ec2f0634-e736-391d-61cc-9478a5040b5e BootID:765d6a07-8a06-4b75-9f96-b53fea8cd4a7 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582250496 Type:vfs Inodes:4048401 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:f1:97:b3:89:17 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:f1:97:b3:89:17 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:52:4c:ff:3e:e2:65 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164500992 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 17 16:52:35.474690 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.474678 2568 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 17 16:52:35.474812 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.474796 2568 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 17 16:52:35.475726 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.475697 2568 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 17 16:52:35.475898 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.475726 2568 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-134-88.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 17 16:52:35.476002 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.475911 2568 topology_manager.go:138] "Creating topology manager with none policy"
Apr 17 16:52:35.476002 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.475940 2568 container_manager_linux.go:306] "Creating device plugin manager"
Apr 17 16:52:35.476002 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.475959 2568 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 16:52:35.477314 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.477301 2568 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 16:52:35.478506 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.478494 2568 state_mem.go:36] "Initialized new in-memory state store"
Apr 17 16:52:35.478626 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.478616 2568 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 17 16:52:35.480820 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.480809 2568 kubelet.go:491] "Attempting to sync node with API server"
Apr 17 16:52:35.480900 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.480826 2568 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 17 16:52:35.480900 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.480848 2568 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 17 16:52:35.480900 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.480863 2568 kubelet.go:397] "Adding apiserver pod source"
Apr 17 16:52:35.480900 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.480875 2568 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 17 16:52:35.482126 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.482113 2568 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 17 16:52:35.482198 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.482136 2568 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 17 16:52:35.484818 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.484802 2568 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 17 16:52:35.486122 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.486110 2568 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 17 16:52:35.487945 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.487915 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 17 16:52:35.488001 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.487951 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 17 16:52:35.488001 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.487961 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 17 16:52:35.488001 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.487994 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 17 16:52:35.488283 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.488004 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 17 16:52:35.488283 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.488014 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 17 16:52:35.488283 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.488023 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 17 16:52:35.488283 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.488032 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 17 16:52:35.488283 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.488041 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 17 16:52:35.488283 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.488051 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 17 16:52:35.488283 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.488081 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 17 16:52:35.488283 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.488094 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 17 16:52:35.488947 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.488915 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 17 16:52:35.488947 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.488947 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 17 16:52:35.490172 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.490146 2568 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-dhvxp"
Apr 17 16:52:35.492909 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.492877 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-134-88.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 17 16:52:35.493471 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.493454 2568 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 17 16:52:35.493513 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.493497 2568 server.go:1295] "Started kubelet"
Apr 17 16:52:35.493684 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:35.493647 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-134-88.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 17 16:52:35.494415 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.494383 2568 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 17 16:52:35.494847 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:35.493654 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 17 16:52:35.494847 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.494428 2568 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 17 16:52:35.494980 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.494872 2568 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 17 16:52:35.495074 ip-10-0-134-88 systemd[1]: Started Kubernetes Kubelet.
Apr 17 16:52:35.496258 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.496244 2568 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 17 16:52:35.496854 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.496834 2568 server.go:317] "Adding debug handlers to kubelet server"
Apr 17 16:52:35.497732 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.497717 2568 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-dhvxp"
Apr 17 16:52:35.499649 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:35.498507 2568 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-88.ec2.internal.18a73317d64b3ca6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-88.ec2.internal,UID:ip-10-0-134-88.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-134-88.ec2.internal,},FirstTimestamp:2026-04-17 16:52:35.49346935 +0000 UTC m=+0.400354157,LastTimestamp:2026-04-17 16:52:35.49346935 +0000 UTC m=+0.400354157,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-88.ec2.internal,}"
Apr 17 16:52:35.500827 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.500809 2568 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 17 16:52:35.500910 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.500854 2568 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 17 16:52:35.501384 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.501362 2568 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 17 16:52:35.501466 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.501391 2568 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 17 16:52:35.501466 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.501410 2568 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 17 16:52:35.501466 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:35.501452 2568 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 17 16:52:35.501598 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.501537 2568 reconstruct.go:97] "Volume reconstruction finished"
Apr 17 16:52:35.501598 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.501546 2568 reconciler.go:26] "Reconciler: start to sync state"
Apr 17 16:52:35.501685 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:35.501636 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-88.ec2.internal\" not found"
Apr 17 16:52:35.501685 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.501650 2568 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 17 16:52:35.501685 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.501661 2568 factory.go:55] Registering systemd factory
Apr 17 16:52:35.501685 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.501670 2568 factory.go:223] Registration of the systemd container factory successfully
Apr 17 16:52:35.501857 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.501821 2568 factory.go:153] Registering CRI-O factory
Apr 17 16:52:35.501857 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.501828 2568 factory.go:223] Registration of the crio container factory successfully
Apr 17 16:52:35.501857 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.501844 2568 factory.go:103] Registering Raw factory
Apr 17 16:52:35.501857 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.501854 2568 manager.go:1196] Started watching for new ooms in manager
Apr 17 16:52:35.502301 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.502286 2568 manager.go:319] Starting recovery of all containers
Apr 17 16:52:35.510858 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.510835 2568 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 16:52:35.513024 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.513007 2568 manager.go:324] Recovery completed
Apr 17 16:52:35.514093 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:35.514045 2568 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-134-88.ec2.internal\" not found" node="ip-10-0-134-88.ec2.internal"
Apr 17 16:52:35.515676 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:35.515654 2568 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory
Apr 17 16:52:35.518811 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.518798 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 16:52:35.521045 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.521030 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-88.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 16:52:35.521114 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.521057 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-88.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 16:52:35.521114 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.521070 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-88.ec2.internal" event="NodeHasSufficientPID"
Apr 17 16:52:35.521522 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.521509 2568 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 17 16:52:35.521522 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.521520 2568 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 17 16:52:35.521606 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.521536 2568 state_mem.go:36] "Initialized new in-memory state store"
Apr 17 16:52:35.524368 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.524356 2568 policy_none.go:49] "None policy: Start"
Apr 17 16:52:35.524429 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.524371 2568 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 17 16:52:35.524429 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.524381 2568 state_mem.go:35] "Initializing new in-memory state store"
Apr 17 16:52:35.561515 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.561499 2568 manager.go:341] "Starting Device Plugin manager"
Apr 17 16:52:35.580889 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:35.561558 2568 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 17 16:52:35.580889 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.561568 2568 server.go:85] "Starting device plugin registration server"
Apr 17 16:52:35.580889 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.561827 2568 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 17 16:52:35.580889 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.561845 2568 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 17 16:52:35.580889 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.561953 2568 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 17 16:52:35.580889 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.562026 2568 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 17 16:52:35.580889 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.562033 2568 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 17 16:52:35.580889 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:35.562771 2568 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 17 16:52:35.580889 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:35.562801 2568 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-134-88.ec2.internal\" not found"
Apr 17 16:52:35.632295 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.632259 2568 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 17 16:52:35.633426 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.633408 2568 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 17 16:52:35.633486 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.633441 2568 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 17 16:52:35.633486 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.633465 2568 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 17 16:52:35.633486 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.633474 2568 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 17 16:52:35.633601 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:35.633557 2568 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 17 16:52:35.636042 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.636025 2568 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 16:52:35.662288 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.662242 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 16:52:35.663221 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.663201 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-88.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 16:52:35.663309 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.663232 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-88.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 16:52:35.663309 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.663244 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-88.ec2.internal" event="NodeHasSufficientPID"
Apr 17 16:52:35.663309 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.663265 2568 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-134-88.ec2.internal"
Apr 17 16:52:35.671497 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.671482 2568 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-134-88.ec2.internal"
Apr 17 16:52:35.671541 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:35.671502 2568 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-134-88.ec2.internal\": node \"ip-10-0-134-88.ec2.internal\" not found"
Apr 17 16:52:35.703481 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:35.703459 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-88.ec2.internal\" not found"
Apr 17 16:52:35.734509 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.734468 2568 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-88.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-134-88.ec2.internal"]
Apr 17 16:52:35.734620 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.734547 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 16:52:35.735397 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.735375 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-88.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 16:52:35.735455 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.735401 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-88.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 16:52:35.735455 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.735415 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-88.ec2.internal" event="NodeHasSufficientPID"
Apr 17 16:52:35.737702 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.737691 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 16:52:35.737887 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.737839 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-88.ec2.internal"
Apr 17 16:52:35.737956 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.737904 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 16:52:35.738329 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.738314 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-88.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 16:52:35.738389 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.738342 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-88.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 16:52:35.738389 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.738356 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-88.ec2.internal" event="NodeHasSufficientPID"
Apr 17 16:52:35.738389 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.738384 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-88.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 16:52:35.738501 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.738407 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-88.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 16:52:35.738501 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.738418 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-88.ec2.internal" event="NodeHasSufficientPID"
Apr 17 16:52:35.740539 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.740526 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-88.ec2.internal"
Apr 17 16:52:35.740604 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.740549 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 16:52:35.741132 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.741118 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-88.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 16:52:35.741206 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.741145 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-88.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 16:52:35.741206 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.741157 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-88.ec2.internal" event="NodeHasSufficientPID"
Apr 17 16:52:35.763479 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:35.763457 2568 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-88.ec2.internal\" not found" node="ip-10-0-134-88.ec2.internal"
Apr 17 16:52:35.767952 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:35.767937 2568 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-88.ec2.internal\" not found" node="ip-10-0-134-88.ec2.internal"
Apr 17 16:52:35.802630 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.802610 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/b1ba083eb7d02ba302027e7be390bc5c-config\") pod \"kube-apiserver-proxy-ip-10-0-134-88.ec2.internal\" (UID: \"b1ba083eb7d02ba302027e7be390bc5c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-88.ec2.internal"
Apr 17 16:52:35.802717 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.802638 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d2e5fe7b3b1a217e75d5333a6a0908b9-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-88.ec2.internal\" (UID: \"d2e5fe7b3b1a217e75d5333a6a0908b9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-88.ec2.internal"
Apr 17 16:52:35.802717 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.802665 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d2e5fe7b3b1a217e75d5333a6a0908b9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-88.ec2.internal\" (UID: \"d2e5fe7b3b1a217e75d5333a6a0908b9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-88.ec2.internal"
Apr 17 16:52:35.803647 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:35.803631 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-88.ec2.internal\" not found"
Apr 17 16:52:35.902888 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.902867 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/b1ba083eb7d02ba302027e7be390bc5c-config\") pod \"kube-apiserver-proxy-ip-10-0-134-88.ec2.internal\" (UID: \"b1ba083eb7d02ba302027e7be390bc5c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-88.ec2.internal"
Apr 17 16:52:35.902977 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.902893 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d2e5fe7b3b1a217e75d5333a6a0908b9-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-88.ec2.internal\" (UID: \"d2e5fe7b3b1a217e75d5333a6a0908b9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-88.ec2.internal"
Apr 17 16:52:35.902977 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.902908 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d2e5fe7b3b1a217e75d5333a6a0908b9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-88.ec2.internal\" (UID: \"d2e5fe7b3b1a217e75d5333a6a0908b9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-88.ec2.internal"
Apr 17 16:52:35.902977 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.902953 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/b1ba083eb7d02ba302027e7be390bc5c-config\") pod \"kube-apiserver-proxy-ip-10-0-134-88.ec2.internal\" (UID: \"b1ba083eb7d02ba302027e7be390bc5c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-88.ec2.internal"
Apr 17 16:52:35.903067 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.902973 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d2e5fe7b3b1a217e75d5333a6a0908b9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-88.ec2.internal\" (UID: \"d2e5fe7b3b1a217e75d5333a6a0908b9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-88.ec2.internal"
Apr 17 16:52:35.903067 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:35.902977 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d2e5fe7b3b1a217e75d5333a6a0908b9-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-88.ec2.internal\" (UID: \"d2e5fe7b3b1a217e75d5333a6a0908b9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-88.ec2.internal"
Apr 17 16:52:35.903961 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:35.903911 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-88.ec2.internal\" not found"
Apr 17 16:52:36.004670 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:36.004608 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-88.ec2.internal\" not found"
Apr 17 16:52:36.065817 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:36.065794 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-88.ec2.internal"
Apr 17 16:52:36.070283 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:36.070268 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-88.ec2.internal"
Apr 17 16:52:36.104916 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:36.104895 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-88.ec2.internal\" not found"
Apr 17 16:52:36.205309 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:36.205288 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-88.ec2.internal\" not found"
Apr 17 16:52:36.305750 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:36.305689 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-88.ec2.internal\" not found"
Apr 17 16:52:36.406115 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:36.406092 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-88.ec2.internal\" not found"
Apr 17 16:52:36.416262 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:36.416245 2568 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 17 16:52:36.416422 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:36.416402 2568 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 16:52:36.416459 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:36.416428 2568 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 16:52:36.499946 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:36.499894 2568 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 16:47:35 +0000 UTC" deadline="2027-12-01 16:12:48.752771679 +0000 UTC"
Apr 17 16:52:36.499946 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:36.499942 2568 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14231h20m12.25283263s"
Apr 17 16:52:36.501460 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:36.501437 2568 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 17 16:52:36.506972 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:36.506949 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-88.ec2.internal\" not found"
Apr 17 16:52:36.511641 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:36.511624 2568 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 16:52:36.532718 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:36.532703 2568 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-8tsgm"
Apr 17 16:52:36.541245 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:36.541230 2568 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-8tsgm"
Apr 17 16:52:36.567500 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:36.567458 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1ba083eb7d02ba302027e7be390bc5c.slice/crio-9c1d08a26fd54d428daf02b070b338c8a50598f6016863684073ef4442e242a4 WatchSource:0}: Error finding container 9c1d08a26fd54d428daf02b070b338c8a50598f6016863684073ef4442e242a4: Status 404 returned error can't find the container with id 9c1d08a26fd54d428daf02b070b338c8a50598f6016863684073ef4442e242a4
Apr 17 16:52:36.568098 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:36.568078 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2e5fe7b3b1a217e75d5333a6a0908b9.slice/crio-68facd0bcce03ca28a1fdda5d044c0fbad5ffc48a465ab2a0e6d7c6a667e5a31 WatchSource:0}: Error finding container 68facd0bcce03ca28a1fdda5d044c0fbad5ffc48a465ab2a0e6d7c6a667e5a31: Status 404 returned error can't find the container with id 68facd0bcce03ca28a1fdda5d044c0fbad5ffc48a465ab2a0e6d7c6a667e5a31
Apr 17 16:52:36.572028 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:36.572014 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 16:52:36.607686 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:36.607668 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-88.ec2.internal\" not found"
Apr 17 16:52:36.628412 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:36.628396 2568 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 16:52:36.636151 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:36.636115 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-88.ec2.internal" event={"ID":"d2e5fe7b3b1a217e75d5333a6a0908b9","Type":"ContainerStarted","Data":"68facd0bcce03ca28a1fdda5d044c0fbad5ffc48a465ab2a0e6d7c6a667e5a31"}
Apr 17 16:52:36.637027 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:36.637008 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-88.ec2.internal" event={"ID":"b1ba083eb7d02ba302027e7be390bc5c","Type":"ContainerStarted","Data":"9c1d08a26fd54d428daf02b070b338c8a50598f6016863684073ef4442e242a4"}
Apr 17 16:52:36.701129 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:36.701108 2568 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-88.ec2.internal"
Apr 17 16:52:36.713125 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:36.713102 2568 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 16:52:36.714865 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:36.714853 2568 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-88.ec2.internal"
Apr 17 16:52:36.723791 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:36.723778 2568 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 16:52:37.090839 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.090796 2568 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 16:52:37.331731 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.331701 2568 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 16:52:37.422489 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.422417 2568 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 16:52:37.481740 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.481709 2568 apiserver.go:52] "Watching apiserver"
Apr 17 16:52:37.489262 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.489238 2568 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 17 16:52:37.491263 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.491239 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-blsng","openshift-ovn-kubernetes/ovnkube-node-q5p9p","openshift-cluster-node-tuning-operator/tuned-tvkbq","openshift-image-registry/node-ca-ndxqw","openshift-multus/multus-xwgv2","openshift-multus/network-metrics-daemon-bcjnr","openshift-network-operator/iptables-alerter-9gh9v","kube-system/konnectivity-agent-h5kgg","kube-system/kube-apiserver-proxy-ip-10-0-134-88.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9w74t","openshift-dns/node-resolver-pw92n","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-88.ec2.internal","openshift-multus/multus-additional-cni-plugins-ct6wx"]
Apr 17 16:52:37.494213 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.494190 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-9gh9v"
Apr 17 16:52:37.496539 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.496497 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-q5p9p"
Apr 17 16:52:37.497066 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.497045 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-54nsz\""
Apr 17 16:52:37.497233 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.497203 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 17 16:52:37.497327 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.497245 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 17 16:52:37.497327 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.497251 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 17 16:52:37.498780 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.498706 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-tvkbq"
Apr 17 16:52:37.500045 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.500018 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 17 16:52:37.500154 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.500071 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 17 16:52:37.500210 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.500150 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 17 16:52:37.500210 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.500193 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 17 16:52:37.500308 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.500213 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-z9bch\""
Apr 17 16:52:37.500308 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.500297 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 17 16:52:37.500404 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.500316 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 17 16:52:37.501042 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.501022 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-ndxqw"
Apr 17 16:52:37.501176 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.501067 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 17 16:52:37.501176 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.501099 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 17 16:52:37.501264 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.501242 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-gd9q2\""
Apr 17 16:52:37.503182 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.503157 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-xwgv2"
Apr 17 16:52:37.503348 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.503326 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 17 16:52:37.503552 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.503530 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 17 16:52:37.503552 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.503541 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 17 16:52:37.503673 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.503626 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-vj7nk\""
Apr 17 16:52:37.505428 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.505412 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 17 16:52:37.505525 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.505456 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bcjnr"
Apr 17 16:52:37.505525 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.505494 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-8kn9t\""
Apr 17 16:52:37.505639 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:37.505518 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bcjnr" podUID="1e75e15a-4a21-46fc-8ab6-d31ca6ee91f3"
Apr 17 16:52:37.505893 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.505872 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 17 16:52:37.505999 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.505879 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 17 16:52:37.506418 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.506403 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 17 16:52:37.507668 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.507634 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-blsng"
Apr 17 16:52:37.507753 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:37.507697 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-blsng" podUID="33d21ed2-8e33-49bf-a161-a1a1a93a72d8"
Apr 17 16:52:37.509930 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.509898 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-h5kgg" Apr 17 16:52:37.512869 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.512431 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f461b687-3271-484f-a873-6a5fb0b1214d-log-socket\") pod \"ovnkube-node-q5p9p\" (UID: \"f461b687-3271-484f-a873-6a5fb0b1214d\") " pod="openshift-ovn-kubernetes/ovnkube-node-q5p9p" Apr 17 16:52:37.512869 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.512579 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79-etc-kubernetes\") pod \"tuned-tvkbq\" (UID: \"0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79\") " pod="openshift-cluster-node-tuning-operator/tuned-tvkbq" Apr 17 16:52:37.512869 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.512622 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79-host\") pod \"tuned-tvkbq\" (UID: \"0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79\") " pod="openshift-cluster-node-tuning-operator/tuned-tvkbq" Apr 17 16:52:37.512869 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.512743 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/66341dd8-b441-446a-be14-71280c6960b2-serviceca\") pod \"node-ca-ndxqw\" (UID: \"66341dd8-b441-446a-be14-71280c6960b2\") " pod="openshift-image-registry/node-ca-ndxqw" Apr 17 16:52:37.512869 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.512828 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bcabc9cc-357b-429a-80c7-605b1281122f-cnibin\") pod \"multus-xwgv2\" (UID: \"bcabc9cc-357b-429a-80c7-605b1281122f\") " pod="openshift-multus/multus-xwgv2" Apr 17 16:52:37.512869 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.512832 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 17 16:52:37.513220 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.512955 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bcabc9cc-357b-429a-80c7-605b1281122f-os-release\") pod \"multus-xwgv2\" (UID: \"bcabc9cc-357b-429a-80c7-605b1281122f\") " pod="openshift-multus/multus-xwgv2" Apr 17 16:52:37.513220 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.513086 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79-etc-tuned\") pod \"tuned-tvkbq\" (UID: \"0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79\") " pod="openshift-cluster-node-tuning-operator/tuned-tvkbq" Apr 17 16:52:37.513220 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.513122 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bcabc9cc-357b-429a-80c7-605b1281122f-etc-kubernetes\") pod \"multus-xwgv2\" (UID: \"bcabc9cc-357b-429a-80c7-605b1281122f\") " pod="openshift-multus/multus-xwgv2" Apr 17 16:52:37.513220 ip-10-0-134-88 kubenswrapper[2568]: 
I0417 16:52:37.513155 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f461b687-3271-484f-a873-6a5fb0b1214d-ovn-node-metrics-cert\") pod \"ovnkube-node-q5p9p\" (UID: \"f461b687-3271-484f-a873-6a5fb0b1214d\") " pod="openshift-ovn-kubernetes/ovnkube-node-q5p9p" Apr 17 16:52:37.513220 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.513187 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bcabc9cc-357b-429a-80c7-605b1281122f-cni-binary-copy\") pod \"multus-xwgv2\" (UID: \"bcabc9cc-357b-429a-80c7-605b1281122f\") " pod="openshift-multus/multus-xwgv2" Apr 17 16:52:37.513455 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.513238 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/bcabc9cc-357b-429a-80c7-605b1281122f-host-run-multus-certs\") pod \"multus-xwgv2\" (UID: \"bcabc9cc-357b-429a-80c7-605b1281122f\") " pod="openshift-multus/multus-xwgv2" Apr 17 16:52:37.513455 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.513254 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-wq4nq\"" Apr 17 16:52:37.513455 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.513264 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/bcabc9cc-357b-429a-80c7-605b1281122f-multus-socket-dir-parent\") pod \"multus-xwgv2\" (UID: \"bcabc9cc-357b-429a-80c7-605b1281122f\") " pod="openshift-multus/multus-xwgv2" Apr 17 16:52:37.513455 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.513308 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f461b687-3271-484f-a873-6a5fb0b1214d-host-slash\") pod \"ovnkube-node-q5p9p\" (UID: \"f461b687-3271-484f-a873-6a5fb0b1214d\") " pod="openshift-ovn-kubernetes/ovnkube-node-q5p9p" Apr 17 16:52:37.513455 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.513418 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f461b687-3271-484f-a873-6a5fb0b1214d-host-kubelet\") pod \"ovnkube-node-q5p9p\" (UID: \"f461b687-3271-484f-a873-6a5fb0b1214d\") " pod="openshift-ovn-kubernetes/ovnkube-node-q5p9p" Apr 17 16:52:37.513687 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.513479 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f461b687-3271-484f-a873-6a5fb0b1214d-node-log\") pod \"ovnkube-node-q5p9p\" (UID: \"f461b687-3271-484f-a873-6a5fb0b1214d\") " pod="openshift-ovn-kubernetes/ovnkube-node-q5p9p" Apr 17 16:52:37.513687 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.513520 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f461b687-3271-484f-a873-6a5fb0b1214d-ovnkube-config\") pod \"ovnkube-node-q5p9p\" (UID: \"f461b687-3271-484f-a873-6a5fb0b1214d\") " pod="openshift-ovn-kubernetes/ovnkube-node-q5p9p" Apr 17 16:52:37.513687 ip-10-0-134-88 kubenswrapper[2568]: I0417 
16:52:37.513589 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/66341dd8-b441-446a-be14-71280c6960b2-host\") pod \"node-ca-ndxqw\" (UID: \"66341dd8-b441-446a-be14-71280c6960b2\") " pod="openshift-image-registry/node-ca-ndxqw" Apr 17 16:52:37.513687 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.513650 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 17 16:52:37.513865 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.513740 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f461b687-3271-484f-a873-6a5fb0b1214d-ovnkube-script-lib\") pod \"ovnkube-node-q5p9p\" (UID: \"f461b687-3271-484f-a873-6a5fb0b1214d\") " pod="openshift-ovn-kubernetes/ovnkube-node-q5p9p" Apr 17 16:52:37.513865 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.513794 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bcabc9cc-357b-429a-80c7-605b1281122f-host-run-netns\") pod \"multus-xwgv2\" (UID: \"bcabc9cc-357b-429a-80c7-605b1281122f\") " pod="openshift-multus/multus-xwgv2" Apr 17 16:52:37.513865 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.513787 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9w74t" Apr 17 16:52:37.514036 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.513829 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5qj7\" (UniqueName: \"kubernetes.io/projected/4ee63b4a-1e22-4b4f-a9a3-c0254922b28f-kube-api-access-l5qj7\") pod \"iptables-alerter-9gh9v\" (UID: \"4ee63b4a-1e22-4b4f-a9a3-c0254922b28f\") " pod="openshift-network-operator/iptables-alerter-9gh9v" Apr 17 16:52:37.514036 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.513961 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f461b687-3271-484f-a873-6a5fb0b1214d-run-systemd\") pod \"ovnkube-node-q5p9p\" (UID: \"f461b687-3271-484f-a873-6a5fb0b1214d\") " pod="openshift-ovn-kubernetes/ovnkube-node-q5p9p" Apr 17 16:52:37.514036 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.514017 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/bcabc9cc-357b-429a-80c7-605b1281122f-host-run-k8s-cni-cncf-io\") pod \"multus-xwgv2\" (UID: \"bcabc9cc-357b-429a-80c7-605b1281122f\") " pod="openshift-multus/multus-xwgv2" Apr 17 16:52:37.514186 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.514072 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bcabc9cc-357b-429a-80c7-605b1281122f-host-var-lib-kubelet\") pod \"multus-xwgv2\" (UID: \"bcabc9cc-357b-429a-80c7-605b1281122f\") " pod="openshift-multus/multus-xwgv2" Apr 17 16:52:37.514186 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.514117 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/bcabc9cc-357b-429a-80c7-605b1281122f-multus-daemon-config\") pod \"multus-xwgv2\" (UID: \"bcabc9cc-357b-429a-80c7-605b1281122f\") " pod="openshift-multus/multus-xwgv2" Apr 17 16:52:37.514289 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.514211 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bcabc9cc-357b-429a-80c7-605b1281122f-multus-conf-dir\") pod \"multus-xwgv2\" (UID: \"bcabc9cc-357b-429a-80c7-605b1281122f\") " pod="openshift-multus/multus-xwgv2" Apr 17 16:52:37.514289 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.514241 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4ee63b4a-1e22-4b4f-a9a3-c0254922b28f-iptables-alerter-script\") pod \"iptables-alerter-9gh9v\" (UID: \"4ee63b4a-1e22-4b4f-a9a3-c0254922b28f\") " pod="openshift-network-operator/iptables-alerter-9gh9v" Apr 17 16:52:37.514386 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.514277 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79-etc-modprobe-d\") pod \"tuned-tvkbq\" (UID: \"0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79\") " pod="openshift-cluster-node-tuning-operator/tuned-tvkbq" Apr 17 16:52:37.514386 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.514331 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79-lib-modules\") pod \"tuned-tvkbq\" (UID: \"0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79\") " pod="openshift-cluster-node-tuning-operator/tuned-tvkbq" Apr 17 16:52:37.514497 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.514447 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bcabc9cc-357b-429a-80c7-605b1281122f-system-cni-dir\") pod \"multus-xwgv2\" (UID: \"bcabc9cc-357b-429a-80c7-605b1281122f\") " pod="openshift-multus/multus-xwgv2" Apr 17 16:52:37.514497 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.514476 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f461b687-3271-484f-a873-6a5fb0b1214d-host-run-netns\") pod \"ovnkube-node-q5p9p\" (UID: \"f461b687-3271-484f-a873-6a5fb0b1214d\") " pod="openshift-ovn-kubernetes/ovnkube-node-q5p9p" Apr 17 16:52:37.514589 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.514537 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f461b687-3271-484f-a873-6a5fb0b1214d-env-overrides\") pod \"ovnkube-node-q5p9p\" (UID: \"f461b687-3271-484f-a873-6a5fb0b1214d\") " pod="openshift-ovn-kubernetes/ovnkube-node-q5p9p" Apr 17 16:52:37.514589 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.514576 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79-var-lib-kubelet\") pod \"tuned-tvkbq\" (UID: \"0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79\") " pod="openshift-cluster-node-tuning-operator/tuned-tvkbq" Apr 17 
16:52:37.515417 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.514610 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bcabc9cc-357b-429a-80c7-605b1281122f-multus-cni-dir\") pod \"multus-xwgv2\" (UID: \"bcabc9cc-357b-429a-80c7-605b1281122f\") " pod="openshift-multus/multus-xwgv2" Apr 17 16:52:37.515417 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.514798 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjxxl\" (UniqueName: \"kubernetes.io/projected/bcabc9cc-357b-429a-80c7-605b1281122f-kube-api-access-rjxxl\") pod \"multus-xwgv2\" (UID: \"bcabc9cc-357b-429a-80c7-605b1281122f\") " pod="openshift-multus/multus-xwgv2" Apr 17 16:52:37.515417 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.514879 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f461b687-3271-484f-a873-6a5fb0b1214d-var-lib-openvswitch\") pod \"ovnkube-node-q5p9p\" (UID: \"f461b687-3271-484f-a873-6a5fb0b1214d\") " pod="openshift-ovn-kubernetes/ovnkube-node-q5p9p" Apr 17 16:52:37.515417 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.515040 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2lsf\" (UniqueName: \"kubernetes.io/projected/f461b687-3271-484f-a873-6a5fb0b1214d-kube-api-access-z2lsf\") pod \"ovnkube-node-q5p9p\" (UID: \"f461b687-3271-484f-a873-6a5fb0b1214d\") " pod="openshift-ovn-kubernetes/ovnkube-node-q5p9p" Apr 17 16:52:37.515417 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.515145 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrn2x\" (UniqueName: \"kubernetes.io/projected/0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79-kube-api-access-hrn2x\") pod \"tuned-tvkbq\" (UID: \"0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79\") " pod="openshift-cluster-node-tuning-operator/tuned-tvkbq" Apr 17 16:52:37.515417 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.515188 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bcabc9cc-357b-429a-80c7-605b1281122f-host-var-lib-cni-bin\") pod \"multus-xwgv2\" (UID: \"bcabc9cc-357b-429a-80c7-605b1281122f\") " pod="openshift-multus/multus-xwgv2" Apr 17 16:52:37.515417 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.515220 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79-etc-sysconfig\") pod \"tuned-tvkbq\" (UID: \"0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79\") " pod="openshift-cluster-node-tuning-operator/tuned-tvkbq" Apr 17 16:52:37.515417 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.515259 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4ee63b4a-1e22-4b4f-a9a3-c0254922b28f-host-slash\") pod \"iptables-alerter-9gh9v\" (UID: \"4ee63b4a-1e22-4b4f-a9a3-c0254922b28f\") " pod="openshift-network-operator/iptables-alerter-9gh9v" Apr 17 16:52:37.515417 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.515288 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f461b687-3271-484f-a873-6a5fb0b1214d-etc-openvswitch\") pod \"ovnkube-node-q5p9p\" (UID: \"f461b687-3271-484f-a873-6a5fb0b1214d\") " pod="openshift-ovn-kubernetes/ovnkube-node-q5p9p" Apr 17 16:52:37.515417 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.515363 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f461b687-3271-484f-a873-6a5fb0b1214d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-q5p9p\" (UID: \"f461b687-3271-484f-a873-6a5fb0b1214d\") " pod="openshift-ovn-kubernetes/ovnkube-node-q5p9p" Apr 17 16:52:37.515417 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.515392 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79-etc-systemd\") pod \"tuned-tvkbq\" (UID: \"0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79\") " pod="openshift-cluster-node-tuning-operator/tuned-tvkbq" Apr 17 16:52:37.515417 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.515423 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79-sys\") pod \"tuned-tvkbq\" (UID: \"0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79\") " pod="openshift-cluster-node-tuning-operator/tuned-tvkbq" Apr 17 16:52:37.516041 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.515452 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f461b687-3271-484f-a873-6a5fb0b1214d-systemd-units\") pod \"ovnkube-node-q5p9p\" (UID: \"f461b687-3271-484f-a873-6a5fb0b1214d\") " pod="openshift-ovn-kubernetes/ovnkube-node-q5p9p" Apr 17 16:52:37.516041 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.515481 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f461b687-3271-484f-a873-6a5fb0b1214d-run-ovn\") pod \"ovnkube-node-q5p9p\" (UID: \"f461b687-3271-484f-a873-6a5fb0b1214d\") " pod="openshift-ovn-kubernetes/ovnkube-node-q5p9p" Apr 17 16:52:37.516041 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.515510 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f461b687-3271-484f-a873-6a5fb0b1214d-host-cni-bin\") pod \"ovnkube-node-q5p9p\" (UID: \"f461b687-3271-484f-a873-6a5fb0b1214d\") " pod="openshift-ovn-kubernetes/ovnkube-node-q5p9p" Apr 17 16:52:37.516041 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.515536 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79-run\") pod \"tuned-tvkbq\" (UID: \"0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79\") " pod="openshift-cluster-node-tuning-operator/tuned-tvkbq" Apr 17 16:52:37.516041 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.515569 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f461b687-3271-484f-a873-6a5fb0b1214d-host-run-ovn-kubernetes\") pod \"ovnkube-node-q5p9p\" (UID: 
\"f461b687-3271-484f-a873-6a5fb0b1214d\") " pod="openshift-ovn-kubernetes/ovnkube-node-q5p9p" Apr 17 16:52:37.516041 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.515656 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f461b687-3271-484f-a873-6a5fb0b1214d-host-cni-netd\") pod \"ovnkube-node-q5p9p\" (UID: \"f461b687-3271-484f-a873-6a5fb0b1214d\") " pod="openshift-ovn-kubernetes/ovnkube-node-q5p9p" Apr 17 16:52:37.516041 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.515686 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79-etc-sysctl-d\") pod \"tuned-tvkbq\" (UID: \"0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79\") " pod="openshift-cluster-node-tuning-operator/tuned-tvkbq" Apr 17 16:52:37.516041 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.515751 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qptq\" (UniqueName: \"kubernetes.io/projected/66341dd8-b441-446a-be14-71280c6960b2-kube-api-access-2qptq\") pod \"node-ca-ndxqw\" (UID: \"66341dd8-b441-446a-be14-71280c6960b2\") " pod="openshift-image-registry/node-ca-ndxqw" Apr 17 16:52:37.516041 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.515801 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f461b687-3271-484f-a873-6a5fb0b1214d-run-openvswitch\") pod \"ovnkube-node-q5p9p\" (UID: \"f461b687-3271-484f-a873-6a5fb0b1214d\") " pod="openshift-ovn-kubernetes/ovnkube-node-q5p9p" Apr 17 16:52:37.516041 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.515838 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/bcabc9cc-357b-429a-80c7-605b1281122f-host-var-lib-cni-multus\") pod \"multus-xwgv2\" (UID: \"bcabc9cc-357b-429a-80c7-605b1281122f\") " pod="openshift-multus/multus-xwgv2" Apr 17 16:52:37.517038 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.516649 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-pw92n"
Apr 17 16:52:37.517038 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.516886 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 17 16:52:37.517038 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.516906 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 17 16:52:37.517038 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.516994 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 17 16:52:37.517257 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.517197 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-66wr2\""
Apr 17 16:52:37.517408 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.517381 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79-etc-sysctl-conf\") pod \"tuned-tvkbq\" (UID: \"0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79\") " pod="openshift-cluster-node-tuning-operator/tuned-tvkbq"
Apr 17 16:52:37.517485 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.517433 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79-tmp\") pod \"tuned-tvkbq\" (UID: \"0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79\") " pod="openshift-cluster-node-tuning-operator/tuned-tvkbq"
Apr 17 16:52:37.517485 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.517476 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/bcabc9cc-357b-429a-80c7-605b1281122f-hostroot\") pod \"multus-xwgv2\" (UID: \"bcabc9cc-357b-429a-80c7-605b1281122f\") " pod="openshift-multus/multus-xwgv2"
Apr 17 16:52:37.519078 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.519016 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 17 16:52:37.519245 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.519232 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-6mljv\""
Apr 17 16:52:37.520125 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.520107 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 17 16:52:37.520671 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.520654 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-ct6wx"
Apr 17 16:52:37.522999 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.522983 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-7dgfk\""
Apr 17 16:52:37.522999 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.522991 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 17 16:52:37.523125 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.523001 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 17 16:52:37.542506 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.542486 2568 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 16:47:36 +0000 UTC" deadline="2027-12-07 19:30:05.478007815 +0000 UTC"
Apr 17 16:52:37.542506 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.542505 2568 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14378h37m27.93550462s"
Apr 17 16:52:37.602675 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.602488 2568 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 17 16:52:37.618296 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.618269 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/bcabc9cc-357b-429a-80c7-605b1281122f-hostroot\") pod \"multus-xwgv2\" (UID: \"bcabc9cc-357b-429a-80c7-605b1281122f\") " pod="openshift-multus/multus-xwgv2"
Apr 17 16:52:37.618459 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.618304 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f461b687-3271-484f-a873-6a5fb0b1214d-log-socket\") pod \"ovnkube-node-q5p9p\" (UID: \"f461b687-3271-484f-a873-6a5fb0b1214d\") " pod="openshift-ovn-kubernetes/ovnkube-node-q5p9p"
Apr 17 16:52:37.618459 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.618322 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79-etc-kubernetes\") pod \"tuned-tvkbq\" (UID: \"0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79\") " pod="openshift-cluster-node-tuning-operator/tuned-tvkbq"
Apr 17 16:52:37.618459 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.618336 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79-host\") pod \"tuned-tvkbq\" (UID: \"0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79\") " pod="openshift-cluster-node-tuning-operator/tuned-tvkbq"
Apr 17 16:52:37.618459 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.618356 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/66341dd8-b441-446a-be14-71280c6960b2-serviceca\") pod \"node-ca-ndxqw\" (UID: \"66341dd8-b441-446a-be14-71280c6960b2\") " pod="openshift-image-registry/node-ca-ndxqw"
Apr 17 16:52:37.618459 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.618380 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bcabc9cc-357b-429a-80c7-605b1281122f-cnibin\") pod \"multus-xwgv2\" (UID: \"bcabc9cc-357b-429a-80c7-605b1281122f\") " pod="openshift-multus/multus-xwgv2"
Apr 17 16:52:37.618459 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.618385 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/bcabc9cc-357b-429a-80c7-605b1281122f-hostroot\") pod \"multus-xwgv2\" (UID: \"bcabc9cc-357b-429a-80c7-605b1281122f\") " pod="openshift-multus/multus-xwgv2"
Apr 17 16:52:37.618459 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.618408 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bcabc9cc-357b-429a-80c7-605b1281122f-os-release\") pod \"multus-xwgv2\" (UID: \"bcabc9cc-357b-429a-80c7-605b1281122f\") " pod="openshift-multus/multus-xwgv2"
Apr 17 16:52:37.618459 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.618409 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f461b687-3271-484f-a873-6a5fb0b1214d-log-socket\") pod \"ovnkube-node-q5p9p\" (UID: \"f461b687-3271-484f-a873-6a5fb0b1214d\") " pod="openshift-ovn-kubernetes/ovnkube-node-q5p9p"
Apr 17 16:52:37.618459 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.618417 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79-etc-kubernetes\") pod \"tuned-tvkbq\" (UID: \"0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79\") " pod="openshift-cluster-node-tuning-operator/tuned-tvkbq"
Apr 17 16:52:37.618459 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.618452 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79-etc-tuned\") pod \"tuned-tvkbq\" (UID: \"0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79\") " pod="openshift-cluster-node-tuning-operator/tuned-tvkbq"
Apr 17 16:52:37.618459 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.618465 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79-host\") pod \"tuned-tvkbq\" (UID: \"0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79\") " pod="openshift-cluster-node-tuning-operator/tuned-tvkbq"
Apr 17 16:52:37.618983 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.618472 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bcabc9cc-357b-429a-80c7-605b1281122f-etc-kubernetes\") pod \"multus-xwgv2\" (UID: \"bcabc9cc-357b-429a-80c7-605b1281122f\") " pod="openshift-multus/multus-xwgv2"
Apr 17 16:52:37.618983 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.618499 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bcabc9cc-357b-429a-80c7-605b1281122f-etc-kubernetes\") pod \"multus-xwgv2\" (UID: \"bcabc9cc-357b-429a-80c7-605b1281122f\") " pod="openshift-multus/multus-xwgv2"
Apr 17 16:52:37.618983 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.618501 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bcabc9cc-357b-429a-80c7-605b1281122f-os-release\") pod \"multus-xwgv2\" (UID: \"bcabc9cc-357b-429a-80c7-605b1281122f\") " pod="openshift-multus/multus-xwgv2"
Apr 17 16:52:37.618983 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.618507 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/df31caad-3c8d-4eb9-88a9-601343e39692-registration-dir\") pod \"aws-ebs-csi-driver-node-9w74t\" (UID: \"df31caad-3c8d-4eb9-88a9-601343e39692\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9w74t"
Apr 17 16:52:37.618983 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.618545 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/df31caad-3c8d-4eb9-88a9-601343e39692-etc-selinux\") pod \"aws-ebs-csi-driver-node-9w74t\" (UID: \"df31caad-3c8d-4eb9-88a9-601343e39692\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9w74t"
Apr 17 16:52:37.618983 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.618550 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bcabc9cc-357b-429a-80c7-605b1281122f-cnibin\") pod \"multus-xwgv2\" (UID: \"bcabc9cc-357b-429a-80c7-605b1281122f\") " pod="openshift-multus/multus-xwgv2"
Apr 17 16:52:37.618983 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.618580 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/561a91fc-d084-4461-b5bf-ad5bc1ac7a9e-system-cni-dir\") pod \"multus-additional-cni-plugins-ct6wx\" (UID: \"561a91fc-d084-4461-b5bf-ad5bc1ac7a9e\") " pod="openshift-multus/multus-additional-cni-plugins-ct6wx"
Apr 17 16:52:37.618983 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.618636 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scdks\" (UniqueName: \"kubernetes.io/projected/1e75e15a-4a21-46fc-8ab6-d31ca6ee91f3-kube-api-access-scdks\") pod \"network-metrics-daemon-bcjnr\" (UID: \"1e75e15a-4a21-46fc-8ab6-d31ca6ee91f3\") " pod="openshift-multus/network-metrics-daemon-bcjnr"
Apr 17 16:52:37.618983 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.618670 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f461b687-3271-484f-a873-6a5fb0b1214d-ovn-node-metrics-cert\") pod \"ovnkube-node-q5p9p\" (UID: \"f461b687-3271-484f-a873-6a5fb0b1214d\") " pod="openshift-ovn-kubernetes/ovnkube-node-q5p9p"
Apr 17 16:52:37.618983 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.618704 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bcabc9cc-357b-429a-80c7-605b1281122f-cni-binary-copy\") pod \"multus-xwgv2\" (UID: \"bcabc9cc-357b-429a-80c7-605b1281122f\") " pod="openshift-multus/multus-xwgv2"
Apr 17 16:52:37.618983 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.618736 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/bcabc9cc-357b-429a-80c7-605b1281122f-host-run-multus-certs\") pod \"multus-xwgv2\" (UID: \"bcabc9cc-357b-429a-80c7-605b1281122f\") " pod="openshift-multus/multus-xwgv2"
Apr 17 16:52:37.618983 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.618762 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lz7r\" (UniqueName: \"kubernetes.io/projected/33d21ed2-8e33-49bf-a161-a1a1a93a72d8-kube-api-access-5lz7r\") pod \"network-check-target-blsng\" (UID: \"33d21ed2-8e33-49bf-a161-a1a1a93a72d8\") " pod="openshift-network-diagnostics/network-check-target-blsng"
Apr 17 16:52:37.618983 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.618797 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/bcabc9cc-357b-429a-80c7-605b1281122f-host-run-multus-certs\") pod \"multus-xwgv2\" (UID: \"bcabc9cc-357b-429a-80c7-605b1281122f\") " pod="openshift-multus/multus-xwgv2"
Apr 17 16:52:37.618983 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.618789 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/bcabc9cc-357b-429a-80c7-605b1281122f-multus-socket-dir-parent\") pod \"multus-xwgv2\" (UID: \"bcabc9cc-357b-429a-80c7-605b1281122f\") " pod="openshift-multus/multus-xwgv2"
Apr 17 16:52:37.618983 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.618839 2568 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 17 16:52:37.618983 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.618856 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f461b687-3271-484f-a873-6a5fb0b1214d-host-slash\") pod \"ovnkube-node-q5p9p\" (UID: \"f461b687-3271-484f-a873-6a5fb0b1214d\") " pod="openshift-ovn-kubernetes/ovnkube-node-q5p9p"
Apr 17 16:52:37.618983 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.618889 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/df31caad-3c8d-4eb9-88a9-601343e39692-socket-dir\") pod \"aws-ebs-csi-driver-node-9w74t\" (UID: \"df31caad-3c8d-4eb9-88a9-601343e39692\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9w74t"
Apr 17 16:52:37.619735 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.618866 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/66341dd8-b441-446a-be14-71280c6960b2-serviceca\") pod \"node-ca-ndxqw\" (UID: \"66341dd8-b441-446a-be14-71280c6960b2\") " pod="openshift-image-registry/node-ca-ndxqw"
Apr 17 16:52:37.619735 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.618959 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/561a91fc-d084-4461-b5bf-ad5bc1ac7a9e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ct6wx\" (UID: \"561a91fc-d084-4461-b5bf-ad5bc1ac7a9e\") " pod="openshift-multus/multus-additional-cni-plugins-ct6wx"
Apr 17 16:52:37.619735 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.618965 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/bcabc9cc-357b-429a-80c7-605b1281122f-multus-socket-dir-parent\") pod \"multus-xwgv2\" (UID: \"bcabc9cc-357b-429a-80c7-605b1281122f\") " pod="openshift-multus/multus-xwgv2"
Apr 17 16:52:37.619735 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.618980 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1e75e15a-4a21-46fc-8ab6-d31ca6ee91f3-metrics-certs\") pod \"network-metrics-daemon-bcjnr\" (UID: \"1e75e15a-4a21-46fc-8ab6-d31ca6ee91f3\") " pod="openshift-multus/network-metrics-daemon-bcjnr"
Apr 17 16:52:37.619735 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.619021 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f461b687-3271-484f-a873-6a5fb0b1214d-host-kubelet\") pod \"ovnkube-node-q5p9p\" (UID: \"f461b687-3271-484f-a873-6a5fb0b1214d\") " pod="openshift-ovn-kubernetes/ovnkube-node-q5p9p"
Apr 17 16:52:37.619735 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.619025 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f461b687-3271-484f-a873-6a5fb0b1214d-host-slash\") pod \"ovnkube-node-q5p9p\" (UID: \"f461b687-3271-484f-a873-6a5fb0b1214d\") " pod="openshift-ovn-kubernetes/ovnkube-node-q5p9p"
Apr 17 16:52:37.619735 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.619036 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f461b687-3271-484f-a873-6a5fb0b1214d-node-log\") pod \"ovnkube-node-q5p9p\" (UID: \"f461b687-3271-484f-a873-6a5fb0b1214d\") " pod="openshift-ovn-kubernetes/ovnkube-node-q5p9p"
Apr 17 16:52:37.619735 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.619085 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f461b687-3271-484f-a873-6a5fb0b1214d-host-kubelet\") pod \"ovnkube-node-q5p9p\" (UID: \"f461b687-3271-484f-a873-6a5fb0b1214d\") " pod="openshift-ovn-kubernetes/ovnkube-node-q5p9p"
Apr 17 16:52:37.619735 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.619090 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/747af7cf-1df7-4cfd-8bb2-841945c9fd3e-konnectivity-ca\") pod \"konnectivity-agent-h5kgg\" (UID: \"747af7cf-1df7-4cfd-8bb2-841945c9fd3e\") " pod="kube-system/konnectivity-agent-h5kgg"
Apr 17 16:52:37.619735 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.619100 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f461b687-3271-484f-a873-6a5fb0b1214d-node-log\") pod \"ovnkube-node-q5p9p\" (UID: \"f461b687-3271-484f-a873-6a5fb0b1214d\") " pod="openshift-ovn-kubernetes/ovnkube-node-q5p9p"
Apr 17 16:52:37.619735 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.619126 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cmmv\" (UniqueName: \"kubernetes.io/projected/df31caad-3c8d-4eb9-88a9-601343e39692-kube-api-access-2cmmv\") pod \"aws-ebs-csi-driver-node-9w74t\" (UID: \"df31caad-3c8d-4eb9-88a9-601343e39692\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9w74t"
Apr 17 16:52:37.619735 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.619160 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f461b687-3271-484f-a873-6a5fb0b1214d-ovnkube-config\") pod \"ovnkube-node-q5p9p\" (UID: \"f461b687-3271-484f-a873-6a5fb0b1214d\") " pod="openshift-ovn-kubernetes/ovnkube-node-q5p9p"
Apr 17 16:52:37.619735 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.619232 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/66341dd8-b441-446a-be14-71280c6960b2-host\") pod \"node-ca-ndxqw\" (UID: \"66341dd8-b441-446a-be14-71280c6960b2\") " pod="openshift-image-registry/node-ca-ndxqw"
Apr 17 16:52:37.619735 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.619290 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bcabc9cc-357b-429a-80c7-605b1281122f-cni-binary-copy\") pod \"multus-xwgv2\" (UID: \"bcabc9cc-357b-429a-80c7-605b1281122f\") " pod="openshift-multus/multus-xwgv2"
Apr 17 16:52:37.619735 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.619356 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f461b687-3271-484f-a873-6a5fb0b1214d-ovnkube-script-lib\") pod \"ovnkube-node-q5p9p\" (UID: \"f461b687-3271-484f-a873-6a5fb0b1214d\") " pod="openshift-ovn-kubernetes/ovnkube-node-q5p9p"
Apr 17 16:52:37.619735 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.619386 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bcabc9cc-357b-429a-80c7-605b1281122f-host-run-netns\") pod \"multus-xwgv2\" (UID: \"bcabc9cc-357b-429a-80c7-605b1281122f\") " pod="openshift-multus/multus-xwgv2"
Apr 17 16:52:37.619735 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.619414 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l5qj7\" (UniqueName: \"kubernetes.io/projected/4ee63b4a-1e22-4b4f-a9a3-c0254922b28f-kube-api-access-l5qj7\") pod \"iptables-alerter-9gh9v\" (UID: \"4ee63b4a-1e22-4b4f-a9a3-c0254922b28f\") " pod="openshift-network-operator/iptables-alerter-9gh9v"
Apr 17 16:52:37.620542 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.619445 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f461b687-3271-484f-a873-6a5fb0b1214d-run-systemd\") pod \"ovnkube-node-q5p9p\" (UID: \"f461b687-3271-484f-a873-6a5fb0b1214d\") " pod="openshift-ovn-kubernetes/ovnkube-node-q5p9p"
Apr 17 16:52:37.620542 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.619466 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/bcabc9cc-357b-429a-80c7-605b1281122f-host-run-k8s-cni-cncf-io\") pod \"multus-xwgv2\" (UID: \"bcabc9cc-357b-429a-80c7-605b1281122f\") " pod="openshift-multus/multus-xwgv2"
Apr 17 16:52:37.620542 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.619486 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bcabc9cc-357b-429a-80c7-605b1281122f-host-var-lib-kubelet\") pod \"multus-xwgv2\" (UID: \"bcabc9cc-357b-429a-80c7-605b1281122f\") " pod="openshift-multus/multus-xwgv2"
Apr 17 16:52:37.620542 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.619535 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/bcabc9cc-357b-429a-80c7-605b1281122f-multus-daemon-config\") pod \"multus-xwgv2\" (UID: \"bcabc9cc-357b-429a-80c7-605b1281122f\") " pod="openshift-multus/multus-xwgv2"
Apr 17 16:52:37.620542 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.619554 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/561a91fc-d084-4461-b5bf-ad5bc1ac7a9e-os-release\") pod \"multus-additional-cni-plugins-ct6wx\" (UID: \"561a91fc-d084-4461-b5bf-ad5bc1ac7a9e\") " pod="openshift-multus/multus-additional-cni-plugins-ct6wx"
Apr 17 16:52:37.620542 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.619579 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/561a91fc-d084-4461-b5bf-ad5bc1ac7a9e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ct6wx\" (UID: \"561a91fc-d084-4461-b5bf-ad5bc1ac7a9e\") " pod="openshift-multus/multus-additional-cni-plugins-ct6wx"
Apr 17 16:52:37.620542 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.619594 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bcabc9cc-357b-429a-80c7-605b1281122f-multus-conf-dir\") pod \"multus-xwgv2\" (UID: \"bcabc9cc-357b-429a-80c7-605b1281122f\") " pod="openshift-multus/multus-xwgv2"
Apr 17 16:52:37.620542 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.619608 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4ee63b4a-1e22-4b4f-a9a3-c0254922b28f-iptables-alerter-script\") pod \"iptables-alerter-9gh9v\" (UID: \"4ee63b4a-1e22-4b4f-a9a3-c0254922b28f\") " pod="openshift-network-operator/iptables-alerter-9gh9v"
Apr 17 16:52:37.620542 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.619624 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79-etc-modprobe-d\") pod \"tuned-tvkbq\" (UID: \"0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79\") " pod="openshift-cluster-node-tuning-operator/tuned-tvkbq"
Apr 17 16:52:37.620542 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.619669 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79-lib-modules\") pod \"tuned-tvkbq\" (UID: \"0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79\") " pod="openshift-cluster-node-tuning-operator/tuned-tvkbq"
Apr 17 16:52:37.620542 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.619687 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bcabc9cc-357b-429a-80c7-605b1281122f-system-cni-dir\") pod \"multus-xwgv2\" (UID: \"bcabc9cc-357b-429a-80c7-605b1281122f\") " pod="openshift-multus/multus-xwgv2"
Apr 17 16:52:37.620542 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.619693 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f461b687-3271-484f-a873-6a5fb0b1214d-ovnkube-config\") pod \"ovnkube-node-q5p9p\" (UID: \"f461b687-3271-484f-a873-6a5fb0b1214d\") " pod="openshift-ovn-kubernetes/ovnkube-node-q5p9p"
Apr 17 16:52:37.620542 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.619700 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f461b687-3271-484f-a873-6a5fb0b1214d-host-run-netns\") pod \"ovnkube-node-q5p9p\" (UID: \"f461b687-3271-484f-a873-6a5fb0b1214d\") " pod="openshift-ovn-kubernetes/ovnkube-node-q5p9p"
Apr 17 16:52:37.620542 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.619715 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f461b687-3271-484f-a873-6a5fb0b1214d-env-overrides\") pod \"ovnkube-node-q5p9p\" (UID: \"f461b687-3271-484f-a873-6a5fb0b1214d\") " pod="openshift-ovn-kubernetes/ovnkube-node-q5p9p"
Apr 17 16:52:37.620542 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.619743 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79-var-lib-kubelet\") pod \"tuned-tvkbq\" (UID: \"0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79\") " pod="openshift-cluster-node-tuning-operator/tuned-tvkbq"
Apr 17 16:52:37.620542 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.619360 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/66341dd8-b441-446a-be14-71280c6960b2-host\") pod \"node-ca-ndxqw\" (UID: \"66341dd8-b441-446a-be14-71280c6960b2\") " pod="openshift-image-registry/node-ca-ndxqw"
Apr 17 16:52:37.620542 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.620023 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f461b687-3271-484f-a873-6a5fb0b1214d-ovnkube-script-lib\") pod \"ovnkube-node-q5p9p\" (UID: \"f461b687-3271-484f-a873-6a5fb0b1214d\") " pod="openshift-ovn-kubernetes/ovnkube-node-q5p9p"
Apr 17 16:52:37.621399 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.620089 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f461b687-3271-484f-a873-6a5fb0b1214d-env-overrides\") pod \"ovnkube-node-q5p9p\" (UID: \"f461b687-3271-484f-a873-6a5fb0b1214d\") " pod="openshift-ovn-kubernetes/ovnkube-node-q5p9p"
Apr 17 16:52:37.621399 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.620092 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bcabc9cc-357b-429a-80c7-605b1281122f-host-run-netns\") pod \"multus-xwgv2\" (UID: \"bcabc9cc-357b-429a-80c7-605b1281122f\") " pod="openshift-multus/multus-xwgv2"
Apr 17 16:52:37.621399 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.620131 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bcabc9cc-357b-429a-80c7-605b1281122f-multus-conf-dir\") pod \"multus-xwgv2\" (UID: \"bcabc9cc-357b-429a-80c7-605b1281122f\") " pod="openshift-multus/multus-xwgv2"
Apr 17 16:52:37.621399 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.620168 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bcabc9cc-357b-429a-80c7-605b1281122f-multus-cni-dir\") pod \"multus-xwgv2\" (UID: \"bcabc9cc-357b-429a-80c7-605b1281122f\") " pod="openshift-multus/multus-xwgv2"
Apr 17 16:52:37.621399 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.620186 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/bcabc9cc-357b-429a-80c7-605b1281122f-multus-daemon-config\") pod \"multus-xwgv2\" (UID: \"bcabc9cc-357b-429a-80c7-605b1281122f\") " pod="openshift-multus/multus-xwgv2"
Apr 17 16:52:37.621399 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.620195 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rjxxl\" (UniqueName: \"kubernetes.io/projected/bcabc9cc-357b-429a-80c7-605b1281122f-kube-api-access-rjxxl\") pod \"multus-xwgv2\" (UID: \"bcabc9cc-357b-429a-80c7-605b1281122f\") " pod="openshift-multus/multus-xwgv2"
Apr 17 16:52:37.621399 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.620242 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79-var-lib-kubelet\") pod \"tuned-tvkbq\" (UID: \"0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79\") " pod="openshift-cluster-node-tuning-operator/tuned-tvkbq"
Apr 17 16:52:37.621399 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.620239 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqblq\" (UniqueName: \"kubernetes.io/projected/561a91fc-d084-4461-b5bf-ad5bc1ac7a9e-kube-api-access-hqblq\") pod \"multus-additional-cni-plugins-ct6wx\" (UID: \"561a91fc-d084-4461-b5bf-ad5bc1ac7a9e\") " pod="openshift-multus/multus-additional-cni-plugins-ct6wx"
Apr 17 16:52:37.621399 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.620309 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f461b687-3271-484f-a873-6a5fb0b1214d-run-systemd\") pod \"ovnkube-node-q5p9p\" (UID: \"f461b687-3271-484f-a873-6a5fb0b1214d\") " pod="openshift-ovn-kubernetes/ovnkube-node-q5p9p"
Apr 17 16:52:37.621399 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.620341 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/bcabc9cc-357b-429a-80c7-605b1281122f-host-run-k8s-cni-cncf-io\") pod \"multus-xwgv2\" (UID: \"bcabc9cc-357b-429a-80c7-605b1281122f\") " pod="openshift-multus/multus-xwgv2"
Apr 17 16:52:37.621399 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.620348 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f461b687-3271-484f-a873-6a5fb0b1214d-var-lib-openvswitch\") pod \"ovnkube-node-q5p9p\" (UID: \"f461b687-3271-484f-a873-6a5fb0b1214d\") " pod="openshift-ovn-kubernetes/ovnkube-node-q5p9p"
Apr 17 16:52:37.621399 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.620387 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z2lsf\" (UniqueName: \"kubernetes.io/projected/f461b687-3271-484f-a873-6a5fb0b1214d-kube-api-access-z2lsf\") pod \"ovnkube-node-q5p9p\" (UID: \"f461b687-3271-484f-a873-6a5fb0b1214d\") " pod="openshift-ovn-kubernetes/ovnkube-node-q5p9p"
Apr 17 16:52:37.621399 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.620410 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79-lib-modules\") pod \"tuned-tvkbq\" (UID: \"0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79\") " pod="openshift-cluster-node-tuning-operator/tuned-tvkbq"
Apr 17 16:52:37.621399 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.620441 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hrn2x\" (UniqueName: \"kubernetes.io/projected/0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79-kube-api-access-hrn2x\") pod \"tuned-tvkbq\" (UID: \"0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79\") " pod="openshift-cluster-node-tuning-operator/tuned-tvkbq"
Apr 17 16:52:37.621399 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.620468 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bcabc9cc-357b-429a-80c7-605b1281122f-host-var-lib-cni-bin\") pod \"multus-xwgv2\" (UID: \"bcabc9cc-357b-429a-80c7-605b1281122f\") " pod="openshift-multus/multus-xwgv2"
Apr 17 16:52:37.621399 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.620503 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79-etc-sysconfig\") pod \"tuned-tvkbq\" (UID: \"0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79\") " pod="openshift-cluster-node-tuning-operator/tuned-tvkbq"
Apr 17 16:52:37.621399 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.620527 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4ee63b4a-1e22-4b4f-a9a3-c0254922b28f-host-slash\") pod \"iptables-alerter-9gh9v\" (UID: \"4ee63b4a-1e22-4b4f-a9a3-c0254922b28f\") " pod="openshift-network-operator/iptables-alerter-9gh9v"
Apr 17 16:52:37.622189 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.620545 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4ee63b4a-1e22-4b4f-a9a3-c0254922b28f-iptables-alerter-script\") pod \"iptables-alerter-9gh9v\" (UID: \"4ee63b4a-1e22-4b4f-a9a3-c0254922b28f\") " pod="openshift-network-operator/iptables-alerter-9gh9v"
Apr 17 16:52:37.622189 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.620551 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f461b687-3271-484f-a873-6a5fb0b1214d-etc-openvswitch\") pod \"ovnkube-node-q5p9p\" (UID: \"f461b687-3271-484f-a873-6a5fb0b1214d\") " pod="openshift-ovn-kubernetes/ovnkube-node-q5p9p"
Apr 17 16:52:37.622189 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.620578 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f461b687-3271-484f-a873-6a5fb0b1214d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-q5p9p\" (UID: \"f461b687-3271-484f-a873-6a5fb0b1214d\") " pod="openshift-ovn-kubernetes/ovnkube-node-q5p9p"
Apr 17 16:52:37.622189 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.620585 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79-etc-modprobe-d\") pod \"tuned-tvkbq\" (UID: \"0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79\") " pod="openshift-cluster-node-tuning-operator/tuned-tvkbq"
Apr 17 16:52:37.622189 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.620592 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bcabc9cc-357b-429a-80c7-605b1281122f-system-cni-dir\") pod \"multus-xwgv2\" (UID: \"bcabc9cc-357b-429a-80c7-605b1281122f\") " pod="openshift-multus/multus-xwgv2"
Apr 17 16:52:37.622189 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.620605 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79-etc-systemd\") pod \"tuned-tvkbq\" (UID: \"0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79\") " pod="openshift-cluster-node-tuning-operator/tuned-tvkbq"
Apr 17 16:52:37.622189 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.620629 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f461b687-3271-484f-a873-6a5fb0b1214d-host-run-netns\") pod \"ovnkube-node-q5p9p\" (UID: \"f461b687-3271-484f-a873-6a5fb0b1214d\") " pod="openshift-ovn-kubernetes/ovnkube-node-q5p9p"
Apr 17 16:52:37.622189 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.620643 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bcabc9cc-357b-429a-80c7-605b1281122f-host-var-lib-kubelet\") pod \"multus-xwgv2\" (UID: \"bcabc9cc-357b-429a-80c7-605b1281122f\") " pod="openshift-multus/multus-xwgv2"
Apr 17 16:52:37.622189 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.620662 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79-etc-systemd\") pod \"tuned-tvkbq\" (UID: \"0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79\") " pod="openshift-cluster-node-tuning-operator/tuned-tvkbq"
Apr 17 16:52:37.622189 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.620666 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f461b687-3271-484f-a873-6a5fb0b1214d-etc-openvswitch\") pod \"ovnkube-node-q5p9p\" (UID: \"f461b687-3271-484f-a873-6a5fb0b1214d\") " pod="openshift-ovn-kubernetes/ovnkube-node-q5p9p"
Apr 17 16:52:37.622189 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.620664 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f461b687-3271-484f-a873-6a5fb0b1214d-var-lib-openvswitch\") pod \"ovnkube-node-q5p9p\" (UID: \"f461b687-3271-484f-a873-6a5fb0b1214d\") " pod="openshift-ovn-kubernetes/ovnkube-node-q5p9p"
Apr 17 16:52:37.622189 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.620705 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4ee63b4a-1e22-4b4f-a9a3-c0254922b28f-host-slash\") pod \"iptables-alerter-9gh9v\" (UID: \"4ee63b4a-1e22-4b4f-a9a3-c0254922b28f\") " pod="openshift-network-operator/iptables-alerter-9gh9v"
Apr 17 16:52:37.622189 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.620735 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bcabc9cc-357b-429a-80c7-605b1281122f-host-var-lib-cni-bin\") pod \"multus-xwgv2\" (UID: \"bcabc9cc-357b-429a-80c7-605b1281122f\") " pod="openshift-multus/multus-xwgv2"
Apr 17 16:52:37.622189 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.620815 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bcabc9cc-357b-429a-80c7-605b1281122f-multus-cni-dir\") pod \"multus-xwgv2\" (UID: \"bcabc9cc-357b-429a-80c7-605b1281122f\") " pod="openshift-multus/multus-xwgv2"
Apr 17 16:52:37.622189 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.620841 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79-etc-sysconfig\") pod \"tuned-tvkbq\" (UID: \"0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79\") " pod="openshift-cluster-node-tuning-operator/tuned-tvkbq"
Apr 17 16:52:37.622189 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.620849 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79-sys\") pod \"tuned-tvkbq\" (UID: \"0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79\") " pod="openshift-cluster-node-tuning-operator/tuned-tvkbq"
Apr 17 16:52:37.622189 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.620871 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f461b687-3271-484f-a873-6a5fb0b1214d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-q5p9p\" (UID: \"f461b687-3271-484f-a873-6a5fb0b1214d\") " pod="openshift-ovn-kubernetes/ovnkube-node-q5p9p"
Apr 17 16:52:37.622189 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.620903 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79-sys\") pod \"tuned-tvkbq\" (UID: \"0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79\") " pod="openshift-cluster-node-tuning-operator/tuned-tvkbq"
Apr 17 16:52:37.622991 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.620902 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/747af7cf-1df7-4cfd-8bb2-841945c9fd3e-agent-certs\") pod \"konnectivity-agent-h5kgg\" (UID: \"747af7cf-1df7-4cfd-8bb2-841945c9fd3e\") " pod="kube-system/konnectivity-agent-h5kgg"
Apr 17 16:52:37.622991 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.620953 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/df31caad-3c8d-4eb9-88a9-601343e39692-device-dir\") pod \"aws-ebs-csi-driver-node-9w74t\" (UID: \"df31caad-3c8d-4eb9-88a9-601343e39692\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9w74t"
Apr 17 16:52:37.622991 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.620989 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f461b687-3271-484f-a873-6a5fb0b1214d-systemd-units\") pod \"ovnkube-node-q5p9p\" (UID: \"f461b687-3271-484f-a873-6a5fb0b1214d\") " pod="openshift-ovn-kubernetes/ovnkube-node-q5p9p"
Apr 17 16:52:37.622991 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.621022 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f461b687-3271-484f-a873-6a5fb0b1214d-run-ovn\") pod \"ovnkube-node-q5p9p\" (UID: \"f461b687-3271-484f-a873-6a5fb0b1214d\") " pod="openshift-ovn-kubernetes/ovnkube-node-q5p9p"
Apr 17 16:52:37.622991 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.621047 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f461b687-3271-484f-a873-6a5fb0b1214d-systemd-units\") pod \"ovnkube-node-q5p9p\" (UID: \"f461b687-3271-484f-a873-6a5fb0b1214d\") " pod="openshift-ovn-kubernetes/ovnkube-node-q5p9p"
Apr 17 16:52:37.622991 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.621052 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f461b687-3271-484f-a873-6a5fb0b1214d-host-cni-bin\") pod \"ovnkube-node-q5p9p\" (UID: \"f461b687-3271-484f-a873-6a5fb0b1214d\") " pod="openshift-ovn-kubernetes/ovnkube-node-q5p9p"
Apr 17 16:52:37.622991 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.621091 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79-run\") pod \"tuned-tvkbq\" (UID: \"0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79\") " pod="openshift-cluster-node-tuning-operator/tuned-tvkbq"
Apr 17 16:52:37.622991 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.621100 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f461b687-3271-484f-a873-6a5fb0b1214d-run-ovn\") pod \"ovnkube-node-q5p9p\" (UID: \"f461b687-3271-484f-a873-6a5fb0b1214d\") " pod="openshift-ovn-kubernetes/ovnkube-node-q5p9p"
Apr 17 16:52:37.622991 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.621095 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f461b687-3271-484f-a873-6a5fb0b1214d-host-cni-bin\") pod \"ovnkube-node-q5p9p\" (UID: \"f461b687-3271-484f-a873-6a5fb0b1214d\") " pod="openshift-ovn-kubernetes/ovnkube-node-q5p9p"
Apr 17 16:52:37.622991 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.621118 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f461b687-3271-484f-a873-6a5fb0b1214d-host-run-ovn-kubernetes\") pod \"ovnkube-node-q5p9p\" (UID: \"f461b687-3271-484f-a873-6a5fb0b1214d\") " pod="openshift-ovn-kubernetes/ovnkube-node-q5p9p"
Apr 17 16:52:37.622991 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.621131 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79-run\") pod \"tuned-tvkbq\" (UID: \"0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79\") " pod="openshift-cluster-node-tuning-operator/tuned-tvkbq"
Apr 17 16:52:37.622991 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.621137 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f461b687-3271-484f-a873-6a5fb0b1214d-host-run-ovn-kubernetes\") pod \"ovnkube-node-q5p9p\" (UID: \"f461b687-3271-484f-a873-6a5fb0b1214d\") " pod="openshift-ovn-kubernetes/ovnkube-node-q5p9p"
Apr 17 16:52:37.622991 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.621149 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f461b687-3271-484f-a873-6a5fb0b1214d-host-cni-netd\") pod \"ovnkube-node-q5p9p\" (UID: \"f461b687-3271-484f-a873-6a5fb0b1214d\") " pod="openshift-ovn-kubernetes/ovnkube-node-q5p9p"
Apr 17 16:52:37.622991 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.621175 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79-etc-sysctl-d\") pod \"tuned-tvkbq\" (UID: \"0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79\") " pod="openshift-cluster-node-tuning-operator/tuned-tvkbq"
Apr 17 16:52:37.622991 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.621205 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2qptq\" (UniqueName: \"kubernetes.io/projected/66341dd8-b441-446a-be14-71280c6960b2-kube-api-access-2qptq\") pod \"node-ca-ndxqw\" (UID: \"66341dd8-b441-446a-be14-71280c6960b2\") " pod="openshift-image-registry/node-ca-ndxqw"
Apr 17 16:52:37.622991 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.621238 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f461b687-3271-484f-a873-6a5fb0b1214d-host-cni-netd\") pod \"ovnkube-node-q5p9p\" (UID: \"f461b687-3271-484f-a873-6a5fb0b1214d\") " pod="openshift-ovn-kubernetes/ovnkube-node-q5p9p"
Apr 17 16:52:37.622991 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.621270 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/df31caad-3c8d-4eb9-88a9-601343e39692-kubelet-dir\") pod \"aws-ebs-csi-driver-node-9w74t\" (UID: \"df31caad-3c8d-4eb9-88a9-601343e39692\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9w74t"
Apr 17 16:52:37.623752 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.621296 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/561a91fc-d084-4461-b5bf-ad5bc1ac7a9e-cnibin\") pod \"multus-additional-cni-plugins-ct6wx\" (UID: \"561a91fc-d084-4461-b5bf-ad5bc1ac7a9e\") " pod="openshift-multus/multus-additional-cni-plugins-ct6wx"
Apr 17 16:52:37.623752 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.621322 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/205623d6-59f4-4e27-8196-83daaf7a9d26-hosts-file\") pod \"node-resolver-pw92n\" (UID: \"205623d6-59f4-4e27-8196-83daaf7a9d26\") " pod="openshift-dns/node-resolver-pw92n"
Apr 17 16:52:37.623752 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.621312 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79-etc-sysctl-d\") pod \"tuned-tvkbq\" (UID: \"0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79\") " pod="openshift-cluster-node-tuning-operator/tuned-tvkbq"
Apr 17 16:52:37.623752 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.621352 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlcb4\" (UniqueName: \"kubernetes.io/projected/205623d6-59f4-4e27-8196-83daaf7a9d26-kube-api-access-hlcb4\") pod \"node-resolver-pw92n\" (UID: \"205623d6-59f4-4e27-8196-83daaf7a9d26\") " pod="openshift-dns/node-resolver-pw92n"
Apr 17 16:52:37.623752 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.621404 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f461b687-3271-484f-a873-6a5fb0b1214d-run-openvswitch\") pod \"ovnkube-node-q5p9p\" (UID: \"f461b687-3271-484f-a873-6a5fb0b1214d\") " pod="openshift-ovn-kubernetes/ovnkube-node-q5p9p"
Apr 17 16:52:37.623752 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.621431 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/bcabc9cc-357b-429a-80c7-605b1281122f-host-var-lib-cni-multus\") pod \"multus-xwgv2\" (UID: \"bcabc9cc-357b-429a-80c7-605b1281122f\") " pod="openshift-multus/multus-xwgv2"
Apr 17 16:52:37.623752 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.621452 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f461b687-3271-484f-a873-6a5fb0b1214d-run-openvswitch\") pod \"ovnkube-node-q5p9p\" (UID: \"f461b687-3271-484f-a873-6a5fb0b1214d\") " pod="openshift-ovn-kubernetes/ovnkube-node-q5p9p"
Apr 17 16:52:37.623752 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.621457 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/561a91fc-d084-4461-b5bf-ad5bc1ac7a9e-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-ct6wx\" (UID: \"561a91fc-d084-4461-b5bf-ad5bc1ac7a9e\") " pod="openshift-multus/multus-additional-cni-plugins-ct6wx"
Apr 17 16:52:37.623752 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.621484 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/205623d6-59f4-4e27-8196-83daaf7a9d26-tmp-dir\") pod \"node-resolver-pw92n\" (UID: \"205623d6-59f4-4e27-8196-83daaf7a9d26\") " pod="openshift-dns/node-resolver-pw92n"
Apr 17 16:52:37.623752 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.621492 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/bcabc9cc-357b-429a-80c7-605b1281122f-host-var-lib-cni-multus\") pod \"multus-xwgv2\" (UID: \"bcabc9cc-357b-429a-80c7-605b1281122f\") " pod="openshift-multus/multus-xwgv2"
Apr 17 16:52:37.623752 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.621515 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79-etc-sysctl-conf\") pod \"tuned-tvkbq\" (UID: \"0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79\") " pod="openshift-cluster-node-tuning-operator/tuned-tvkbq"
Apr 17 16:52:37.623752 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.621537 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79-tmp\") pod \"tuned-tvkbq\" (UID: \"0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79\") " pod="openshift-cluster-node-tuning-operator/tuned-tvkbq"
Apr 17 16:52:37.623752 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.621553 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/df31caad-3c8d-4eb9-88a9-601343e39692-sys-fs\") pod \"aws-ebs-csi-driver-node-9w74t\" (UID: \"df31caad-3c8d-4eb9-88a9-601343e39692\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9w74t"
Apr 17 16:52:37.623752 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.621577 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/561a91fc-d084-4461-b5bf-ad5bc1ac7a9e-cni-binary-copy\") pod \"multus-additional-cni-plugins-ct6wx\" (UID: \"561a91fc-d084-4461-b5bf-ad5bc1ac7a9e\") " pod="openshift-multus/multus-additional-cni-plugins-ct6wx"
Apr 17 16:52:37.623752 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.621651 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79-etc-sysctl-conf\") pod \"tuned-tvkbq\" (UID: \"0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79\") " pod="openshift-cluster-node-tuning-operator/tuned-tvkbq"
Apr 17 16:52:37.623752 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.622483 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f461b687-3271-484f-a873-6a5fb0b1214d-ovn-node-metrics-cert\") pod \"ovnkube-node-q5p9p\" (UID: \"f461b687-3271-484f-a873-6a5fb0b1214d\") " pod="openshift-ovn-kubernetes/ovnkube-node-q5p9p"
Apr 17 16:52:37.623752 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.622895 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79-etc-tuned\") pod \"tuned-tvkbq\" (UID: \"0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79\") " pod="openshift-cluster-node-tuning-operator/tuned-tvkbq"
Apr 17 16:52:37.624537 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.623887 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79-tmp\") pod \"tuned-tvkbq\" (UID: \"0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79\") " pod="openshift-cluster-node-tuning-operator/tuned-tvkbq"
Apr 17 16:52:37.633384 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.633134 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5qj7\" (UniqueName: \"kubernetes.io/projected/4ee63b4a-1e22-4b4f-a9a3-c0254922b28f-kube-api-access-l5qj7\") pod \"iptables-alerter-9gh9v\" (UID: \"4ee63b4a-1e22-4b4f-a9a3-c0254922b28f\") " pod="openshift-network-operator/iptables-alerter-9gh9v"
Apr 17 16:52:37.633384 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.633183 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjxxl\" (UniqueName: \"kubernetes.io/projected/bcabc9cc-357b-429a-80c7-605b1281122f-kube-api-access-rjxxl\") pod \"multus-xwgv2\" (UID: \"bcabc9cc-357b-429a-80c7-605b1281122f\") " pod="openshift-multus/multus-xwgv2"
Apr 17 16:52:37.633551 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.633384 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrn2x\" (UniqueName: \"kubernetes.io/projected/0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79-kube-api-access-hrn2x\") pod \"tuned-tvkbq\" (UID: \"0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79\") " pod="openshift-cluster-node-tuning-operator/tuned-tvkbq"
Apr 17 16:52:37.633829 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.633783 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2lsf\" (UniqueName: \"kubernetes.io/projected/f461b687-3271-484f-a873-6a5fb0b1214d-kube-api-access-z2lsf\") pod \"ovnkube-node-q5p9p\" (UID: \"f461b687-3271-484f-a873-6a5fb0b1214d\") " pod="openshift-ovn-kubernetes/ovnkube-node-q5p9p"
Apr 17 16:52:37.633946 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.633914 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qptq\" (UniqueName: \"kubernetes.io/projected/66341dd8-b441-446a-be14-71280c6960b2-kube-api-access-2qptq\") pod \"node-ca-ndxqw\" (UID: \"66341dd8-b441-446a-be14-71280c6960b2\") " pod="openshift-image-registry/node-ca-ndxqw"
Apr 17 16:52:37.722757 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.722674 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hqblq\" (UniqueName: \"kubernetes.io/projected/561a91fc-d084-4461-b5bf-ad5bc1ac7a9e-kube-api-access-hqblq\") pod \"multus-additional-cni-plugins-ct6wx\" (UID: \"561a91fc-d084-4461-b5bf-ad5bc1ac7a9e\") " pod="openshift-multus/multus-additional-cni-plugins-ct6wx"
Apr 17 16:52:37.722757 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.722721 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/747af7cf-1df7-4cfd-8bb2-841945c9fd3e-agent-certs\") pod \"konnectivity-agent-h5kgg\" (UID: \"747af7cf-1df7-4cfd-8bb2-841945c9fd3e\") " pod="kube-system/konnectivity-agent-h5kgg"
Apr 17 16:52:37.722757 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.722747 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/df31caad-3c8d-4eb9-88a9-601343e39692-device-dir\") pod \"aws-ebs-csi-driver-node-9w74t\" (UID: \"df31caad-3c8d-4eb9-88a9-601343e39692\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9w74t"
Apr 17 16:52:37.723051 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.722768 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/df31caad-3c8d-4eb9-88a9-601343e39692-kubelet-dir\") pod \"aws-ebs-csi-driver-node-9w74t\" (UID: \"df31caad-3c8d-4eb9-88a9-601343e39692\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9w74t"
Apr 17 16:52:37.723051 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.722785 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/561a91fc-d084-4461-b5bf-ad5bc1ac7a9e-cnibin\") pod \"multus-additional-cni-plugins-ct6wx\" (UID: \"561a91fc-d084-4461-b5bf-ad5bc1ac7a9e\") " pod="openshift-multus/multus-additional-cni-plugins-ct6wx"
Apr 17 16:52:37.723051 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.722801 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/205623d6-59f4-4e27-8196-83daaf7a9d26-hosts-file\") pod \"node-resolver-pw92n\" (UID: \"205623d6-59f4-4e27-8196-83daaf7a9d26\") " pod="openshift-dns/node-resolver-pw92n"
Apr 17 16:52:37.723051 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.722821 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hlcb4\" (UniqueName: \"kubernetes.io/projected/205623d6-59f4-4e27-8196-83daaf7a9d26-kube-api-access-hlcb4\") pod \"node-resolver-pw92n\" (UID: \"205623d6-59f4-4e27-8196-83daaf7a9d26\") " pod="openshift-dns/node-resolver-pw92n"
Apr 17 16:52:37.723051 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.722853 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/df31caad-3c8d-4eb9-88a9-601343e39692-device-dir\") pod \"aws-ebs-csi-driver-node-9w74t\" (UID: \"df31caad-3c8d-4eb9-88a9-601343e39692\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9w74t"
Apr 17 16:52:37.723051 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.722876 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/561a91fc-d084-4461-b5bf-ad5bc1ac7a9e-cnibin\") pod \"multus-additional-cni-plugins-ct6wx\" (UID: \"561a91fc-d084-4461-b5bf-ad5bc1ac7a9e\") " pod="openshift-multus/multus-additional-cni-plugins-ct6wx"
Apr 17 16:52:37.723051 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.722884 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/df31caad-3c8d-4eb9-88a9-601343e39692-kubelet-dir\") pod \"aws-ebs-csi-driver-node-9w74t\" (UID: \"df31caad-3c8d-4eb9-88a9-601343e39692\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9w74t"
Apr 17 16:52:37.723051 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.722903 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/205623d6-59f4-4e27-8196-83daaf7a9d26-hosts-file\") pod \"node-resolver-pw92n\" (UID: \"205623d6-59f4-4e27-8196-83daaf7a9d26\") " pod="openshift-dns/node-resolver-pw92n"
Apr 17 16:52:37.723051 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.722946 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/561a91fc-d084-4461-b5bf-ad5bc1ac7a9e-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-ct6wx\" (UID: \"561a91fc-d084-4461-b5bf-ad5bc1ac7a9e\") " pod="openshift-multus/multus-additional-cni-plugins-ct6wx"
Apr 17 16:52:37.723051 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.722981 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/205623d6-59f4-4e27-8196-83daaf7a9d26-tmp-dir\") pod \"node-resolver-pw92n\" (UID: \"205623d6-59f4-4e27-8196-83daaf7a9d26\") " pod="openshift-dns/node-resolver-pw92n"
Apr 17 16:52:37.723051 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.723008 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/df31caad-3c8d-4eb9-88a9-601343e39692-sys-fs\") pod \"aws-ebs-csi-driver-node-9w74t\" (UID: \"df31caad-3c8d-4eb9-88a9-601343e39692\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9w74t"
Apr 17 16:52:37.723051 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.723034 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/561a91fc-d084-4461-b5bf-ad5bc1ac7a9e-cni-binary-copy\") pod \"multus-additional-cni-plugins-ct6wx\" (UID: \"561a91fc-d084-4461-b5bf-ad5bc1ac7a9e\") " pod="openshift-multus/multus-additional-cni-plugins-ct6wx"
Apr 17 16:52:37.723615 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.723110 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/df31caad-3c8d-4eb9-88a9-601343e39692-sys-fs\") pod \"aws-ebs-csi-driver-node-9w74t\" (UID: \"df31caad-3c8d-4eb9-88a9-601343e39692\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9w74t"
Apr 17 16:52:37.723615 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.723158 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/df31caad-3c8d-4eb9-88a9-601343e39692-registration-dir\") pod \"aws-ebs-csi-driver-node-9w74t\" (UID: \"df31caad-3c8d-4eb9-88a9-601343e39692\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9w74t"
Apr 17 16:52:37.723615 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.723188 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/df31caad-3c8d-4eb9-88a9-601343e39692-etc-selinux\") pod \"aws-ebs-csi-driver-node-9w74t\" (UID: \"df31caad-3c8d-4eb9-88a9-601343e39692\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9w74t"
Apr 17 16:52:37.723615 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.723218 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/561a91fc-d084-4461-b5bf-ad5bc1ac7a9e-system-cni-dir\") pod \"multus-additional-cni-plugins-ct6wx\" (UID: \"561a91fc-d084-4461-b5bf-ad5bc1ac7a9e\") " pod="openshift-multus/multus-additional-cni-plugins-ct6wx"
Apr 17 16:52:37.723615 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.723228 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/df31caad-3c8d-4eb9-88a9-601343e39692-registration-dir\") pod \"aws-ebs-csi-driver-node-9w74t\" (UID: \"df31caad-3c8d-4eb9-88a9-601343e39692\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9w74t"
Apr 17 16:52:37.723615 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.723243 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-scdks\" (UniqueName: \"kubernetes.io/projected/1e75e15a-4a21-46fc-8ab6-d31ca6ee91f3-kube-api-access-scdks\") pod \"network-metrics-daemon-bcjnr\" (UID: \"1e75e15a-4a21-46fc-8ab6-d31ca6ee91f3\") " pod="openshift-multus/network-metrics-daemon-bcjnr"
Apr 17 16:52:37.723615 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.723273 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5lz7r\" (UniqueName: \"kubernetes.io/projected/33d21ed2-8e33-49bf-a161-a1a1a93a72d8-kube-api-access-5lz7r\") pod \"network-check-target-blsng\" (UID: \"33d21ed2-8e33-49bf-a161-a1a1a93a72d8\") " pod="openshift-network-diagnostics/network-check-target-blsng"
Apr 17 16:52:37.723615 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.723286 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/df31caad-3c8d-4eb9-88a9-601343e39692-etc-selinux\") pod \"aws-ebs-csi-driver-node-9w74t\" (UID: \"df31caad-3c8d-4eb9-88a9-601343e39692\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9w74t"
Apr 17 16:52:37.723615 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.723302 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/df31caad-3c8d-4eb9-88a9-601343e39692-socket-dir\") pod \"aws-ebs-csi-driver-node-9w74t\" (UID: \"df31caad-3c8d-4eb9-88a9-601343e39692\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9w74t"
Apr 17 16:52:37.723615 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.723331 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/561a91fc-d084-4461-b5bf-ad5bc1ac7a9e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ct6wx\" (UID: \"561a91fc-d084-4461-b5bf-ad5bc1ac7a9e\") " pod="openshift-multus/multus-additional-cni-plugins-ct6wx"
Apr 17 16:52:37.723615 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.723336 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/205623d6-59f4-4e27-8196-83daaf7a9d26-tmp-dir\") pod \"node-resolver-pw92n\" (UID: \"205623d6-59f4-4e27-8196-83daaf7a9d26\") " pod="openshift-dns/node-resolver-pw92n"
Apr 17 16:52:37.723615 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.723345 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/561a91fc-d084-4461-b5bf-ad5bc1ac7a9e-system-cni-dir\") pod \"multus-additional-cni-plugins-ct6wx\" (UID: \"561a91fc-d084-4461-b5bf-ad5bc1ac7a9e\") " pod="openshift-multus/multus-additional-cni-plugins-ct6wx"
Apr 17 16:52:37.723615 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.723358 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1e75e15a-4a21-46fc-8ab6-d31ca6ee91f3-metrics-certs\") pod \"network-metrics-daemon-bcjnr\" (UID: \"1e75e15a-4a21-46fc-8ab6-d31ca6ee91f3\") " pod="openshift-multus/network-metrics-daemon-bcjnr"
Apr 17 16:52:37.723615 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.723387 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/747af7cf-1df7-4cfd-8bb2-841945c9fd3e-konnectivity-ca\") pod \"konnectivity-agent-h5kgg\" (UID: \"747af7cf-1df7-4cfd-8bb2-841945c9fd3e\") " pod="kube-system/konnectivity-agent-h5kgg"
Apr 17 16:52:37.723615 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.723413 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2cmmv\" (UniqueName: \"kubernetes.io/projected/df31caad-3c8d-4eb9-88a9-601343e39692-kube-api-access-2cmmv\") pod \"aws-ebs-csi-driver-node-9w74t\" (UID: \"df31caad-3c8d-4eb9-88a9-601343e39692\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9w74t"
Apr 17 16:52:37.723615 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.723413 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/561a91fc-d084-4461-b5bf-ad5bc1ac7a9e-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-ct6wx\" (UID: \"561a91fc-d084-4461-b5bf-ad5bc1ac7a9e\") " pod="openshift-multus/multus-additional-cni-plugins-ct6wx"
Apr 17 16:52:37.724447 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.723446 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/561a91fc-d084-4461-b5bf-ad5bc1ac7a9e-os-release\") pod \"multus-additional-cni-plugins-ct6wx\" (UID: \"561a91fc-d084-4461-b5bf-ad5bc1ac7a9e\") " pod="openshift-multus/multus-additional-cni-plugins-ct6wx"
Apr 17 16:52:37.724447 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.723472 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/561a91fc-d084-4461-b5bf-ad5bc1ac7a9e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ct6wx\" (UID: \"561a91fc-d084-4461-b5bf-ad5bc1ac7a9e\") " pod="openshift-multus/multus-additional-cni-plugins-ct6wx"
Apr 17 16:52:37.724447 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:37.723501 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 16:52:37.724447 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.723550 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/561a91fc-d084-4461-b5bf-ad5bc1ac7a9e-cni-binary-copy\") pod \"multus-additional-cni-plugins-ct6wx\" (UID: \"561a91fc-d084-4461-b5bf-ad5bc1ac7a9e\") " pod="openshift-multus/multus-additional-cni-plugins-ct6wx" Apr 17 16:52:37.724447 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.723560 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/df31caad-3c8d-4eb9-88a9-601343e39692-socket-dir\") pod \"aws-ebs-csi-driver-node-9w74t\" (UID: \"df31caad-3c8d-4eb9-88a9-601343e39692\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9w74t" Apr 17 16:52:37.724447 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:37.723575 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e75e15a-4a21-46fc-8ab6-d31ca6ee91f3-metrics-certs podName:1e75e15a-4a21-46fc-8ab6-d31ca6ee91f3 nodeName:}" failed. No retries permitted until 2026-04-17 16:52:38.223553652 +0000 UTC m=+3.130438464 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1e75e15a-4a21-46fc-8ab6-d31ca6ee91f3-metrics-certs") pod "network-metrics-daemon-bcjnr" (UID: "1e75e15a-4a21-46fc-8ab6-d31ca6ee91f3") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:52:37.724447 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.723635 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/561a91fc-d084-4461-b5bf-ad5bc1ac7a9e-os-release\") pod \"multus-additional-cni-plugins-ct6wx\" (UID: \"561a91fc-d084-4461-b5bf-ad5bc1ac7a9e\") " pod="openshift-multus/multus-additional-cni-plugins-ct6wx" Apr 17 16:52:37.724447 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.723686 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/561a91fc-d084-4461-b5bf-ad5bc1ac7a9e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ct6wx\" (UID: \"561a91fc-d084-4461-b5bf-ad5bc1ac7a9e\") " pod="openshift-multus/multus-additional-cni-plugins-ct6wx" Apr 17 16:52:37.724447 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.724039 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/747af7cf-1df7-4cfd-8bb2-841945c9fd3e-konnectivity-ca\") pod \"konnectivity-agent-h5kgg\" (UID: \"747af7cf-1df7-4cfd-8bb2-841945c9fd3e\") " pod="kube-system/konnectivity-agent-h5kgg" Apr 17 16:52:37.724447 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.724401 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/561a91fc-d084-4461-b5bf-ad5bc1ac7a9e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ct6wx\" (UID: \"561a91fc-d084-4461-b5bf-ad5bc1ac7a9e\") " pod="openshift-multus/multus-additional-cni-plugins-ct6wx" Apr 17 16:52:37.725938 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.725899 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/747af7cf-1df7-4cfd-8bb2-841945c9fd3e-agent-certs\") pod \"konnectivity-agent-h5kgg\" (UID: \"747af7cf-1df7-4cfd-8bb2-841945c9fd3e\") " pod="kube-system/konnectivity-agent-h5kgg" Apr 17 16:52:37.730170 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:37.730141 2568 projected.go:289] Couldn't get configMap 
openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:52:37.730170 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:37.730165 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:52:37.730307 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:37.730178 2568 projected.go:194] Error preparing data for projected volume kube-api-access-5lz7r for pod openshift-network-diagnostics/network-check-target-blsng: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:52:37.730307 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:37.730241 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/33d21ed2-8e33-49bf-a161-a1a1a93a72d8-kube-api-access-5lz7r podName:33d21ed2-8e33-49bf-a161-a1a1a93a72d8 nodeName:}" failed. No retries permitted until 2026-04-17 16:52:38.230222192 +0000 UTC m=+3.137106990 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-5lz7r" (UniqueName: "kubernetes.io/projected/33d21ed2-8e33-49bf-a161-a1a1a93a72d8-kube-api-access-5lz7r") pod "network-check-target-blsng" (UID: "33d21ed2-8e33-49bf-a161-a1a1a93a72d8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:52:37.732400 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.732375 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cmmv\" (UniqueName: \"kubernetes.io/projected/df31caad-3c8d-4eb9-88a9-601343e39692-kube-api-access-2cmmv\") pod \"aws-ebs-csi-driver-node-9w74t\" (UID: \"df31caad-3c8d-4eb9-88a9-601343e39692\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9w74t" Apr 17 16:52:37.733273 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.733248 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqblq\" (UniqueName: \"kubernetes.io/projected/561a91fc-d084-4461-b5bf-ad5bc1ac7a9e-kube-api-access-hqblq\") pod \"multus-additional-cni-plugins-ct6wx\" (UID: \"561a91fc-d084-4461-b5bf-ad5bc1ac7a9e\") " pod="openshift-multus/multus-additional-cni-plugins-ct6wx" Apr 17 16:52:37.733556 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.733537 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlcb4\" (UniqueName: \"kubernetes.io/projected/205623d6-59f4-4e27-8196-83daaf7a9d26-kube-api-access-hlcb4\") pod \"node-resolver-pw92n\" (UID: \"205623d6-59f4-4e27-8196-83daaf7a9d26\") " pod="openshift-dns/node-resolver-pw92n" Apr 17 16:52:37.734040 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.734023 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-scdks\" (UniqueName: \"kubernetes.io/projected/1e75e15a-4a21-46fc-8ab6-d31ca6ee91f3-kube-api-access-scdks\") pod \"network-metrics-daemon-bcjnr\" (UID: \"1e75e15a-4a21-46fc-8ab6-d31ca6ee91f3\") " pod="openshift-multus/network-metrics-daemon-bcjnr" Apr 17 16:52:37.807163 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.807131 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-9gh9v" Apr 17 16:52:37.816904 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.816883 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-q5p9p" Apr 17 16:52:37.824605 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.824586 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-tvkbq" Apr 17 16:52:37.829838 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.829819 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-ndxqw" Apr 17 16:52:37.836409 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.836385 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-xwgv2" Apr 17 16:52:37.841974 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.841960 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-h5kgg" Apr 17 16:52:37.848495 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.848480 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9w74t" Apr 17 16:52:37.854992 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.854972 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-pw92n" Apr 17 16:52:37.859506 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:37.859488 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-ct6wx" Apr 17 16:52:38.226779 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:38.226748 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1e75e15a-4a21-46fc-8ab6-d31ca6ee91f3-metrics-certs\") pod \"network-metrics-daemon-bcjnr\" (UID: \"1e75e15a-4a21-46fc-8ab6-d31ca6ee91f3\") " pod="openshift-multus/network-metrics-daemon-bcjnr" Apr 17 16:52:38.226931 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:38.226865 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:52:38.226977 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:38.226951 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e75e15a-4a21-46fc-8ab6-d31ca6ee91f3-metrics-certs podName:1e75e15a-4a21-46fc-8ab6-d31ca6ee91f3 nodeName:}" failed. No retries permitted until 2026-04-17 16:52:39.226934911 +0000 UTC m=+4.133819719 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1e75e15a-4a21-46fc-8ab6-d31ca6ee91f3-metrics-certs") pod "network-metrics-daemon-bcjnr" (UID: "1e75e15a-4a21-46fc-8ab6-d31ca6ee91f3") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:52:38.247309 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:38.247261 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod205623d6_59f4_4e27_8196_83daaf7a9d26.slice/crio-c110c2f089fe8f05f04f1ddb78c802c141a0747ef006a41ed069deb3935ea8b4 WatchSource:0}: Error finding container c110c2f089fe8f05f04f1ddb78c802c141a0747ef006a41ed069deb3935ea8b4: Status 404 returned error can't find the container with id c110c2f089fe8f05f04f1ddb78c802c141a0747ef006a41ed069deb3935ea8b4 Apr 17 16:52:38.248687 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:38.248542 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod561a91fc_d084_4461_b5bf_ad5bc1ac7a9e.slice/crio-fa45a809ec5f6173ef8fc8269abea8067c499f1763912a0d22406abea8c84e18 WatchSource:0}: Error finding container fa45a809ec5f6173ef8fc8269abea8067c499f1763912a0d22406abea8c84e18: Status 404 returned error can't find the container with id fa45a809ec5f6173ef8fc8269abea8067c499f1763912a0d22406abea8c84e18 Apr 17 16:52:38.249976 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:38.249828 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e1ddd0a_ef10_44ae_a5c9_d1857ba5dd79.slice/crio-4c99ac5a43a81763ea5ac3bfe5987f7406238e84e31b0a676767e0a03ede8b0f WatchSource:0}: Error finding container 4c99ac5a43a81763ea5ac3bfe5987f7406238e84e31b0a676767e0a03ede8b0f: Status 404 returned error can't find the container with id 4c99ac5a43a81763ea5ac3bfe5987f7406238e84e31b0a676767e0a03ede8b0f Apr 17 16:52:38.250749 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:38.250714 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ee63b4a_1e22_4b4f_a9a3_c0254922b28f.slice/crio-41c9936f38907b9365738c0c0f13f9371aba9030bf0c413a85a5d1b8584fc477 WatchSource:0}: Error finding container 41c9936f38907b9365738c0c0f13f9371aba9030bf0c413a85a5d1b8584fc477: Status 404 returned error can't find the container with id 41c9936f38907b9365738c0c0f13f9371aba9030bf0c413a85a5d1b8584fc477 Apr 17 16:52:38.252015 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:38.251735 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbcabc9cc_357b_429a_80c7_605b1281122f.slice/crio-8f78ec129d51339af5298b3f4ec403d1f9c7eba584569e391cb9acc072c579d3 WatchSource:0}: Error finding container 8f78ec129d51339af5298b3f4ec403d1f9c7eba584569e391cb9acc072c579d3: Status 404 returned error can't find the container with id 8f78ec129d51339af5298b3f4ec403d1f9c7eba584569e391cb9acc072c579d3 Apr 17 16:52:38.252777 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:38.252676 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66341dd8_b441_446a_be14_71280c6960b2.slice/crio-f5144ecb77a706f75bd1605f035172f6922d5e8508371d8df4b1fbc84709f238 WatchSource:0}: Error finding container f5144ecb77a706f75bd1605f035172f6922d5e8508371d8df4b1fbc84709f238: Status 404 returned error can't find the 
container with id f5144ecb77a706f75bd1605f035172f6922d5e8508371d8df4b1fbc84709f238 Apr 17 16:52:38.254132 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:38.254108 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf461b687_3271_484f_a873_6a5fb0b1214d.slice/crio-9f384ff797f87e960ca4a0d96457f461276c4dbfc1c2283051c8f021cc2e8293 WatchSource:0}: Error finding container 9f384ff797f87e960ca4a0d96457f461276c4dbfc1c2283051c8f021cc2e8293: Status 404 returned error can't find the container with id 9f384ff797f87e960ca4a0d96457f461276c4dbfc1c2283051c8f021cc2e8293 Apr 17 16:52:38.255666 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:38.255090 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod747af7cf_1df7_4cfd_8bb2_841945c9fd3e.slice/crio-e8ac212f2d946451fe39bb04038bf58d81aa7a3fc95376d51a955f120e6b6d54 WatchSource:0}: Error finding container e8ac212f2d946451fe39bb04038bf58d81aa7a3fc95376d51a955f120e6b6d54: Status 404 returned error can't find the container with id e8ac212f2d946451fe39bb04038bf58d81aa7a3fc95376d51a955f120e6b6d54 Apr 17 16:52:38.256006 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:52:38.255895 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf31caad_3c8d_4eb9_88a9_601343e39692.slice/crio-c2949c0b0c17dce042c0ec7925ce4ff1d948639e5741e0b74655bc0d1ef876ec WatchSource:0}: Error finding container c2949c0b0c17dce042c0ec7925ce4ff1d948639e5741e0b74655bc0d1ef876ec: Status 404 returned error can't find the container with id c2949c0b0c17dce042c0ec7925ce4ff1d948639e5741e0b74655bc0d1ef876ec Apr 17 16:52:38.327726 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:38.327699 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5lz7r\" (UniqueName: \"kubernetes.io/projected/33d21ed2-8e33-49bf-a161-a1a1a93a72d8-kube-api-access-5lz7r\") pod \"network-check-target-blsng\" (UID: \"33d21ed2-8e33-49bf-a161-a1a1a93a72d8\") " pod="openshift-network-diagnostics/network-check-target-blsng" Apr 17 16:52:38.327845 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:38.327831 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:52:38.327884 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:38.327851 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:52:38.327884 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:38.327864 2568 projected.go:194] Error preparing data for projected volume kube-api-access-5lz7r for pod openshift-network-diagnostics/network-check-target-blsng: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:52:38.327961 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:38.327954 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/33d21ed2-8e33-49bf-a161-a1a1a93a72d8-kube-api-access-5lz7r podName:33d21ed2-8e33-49bf-a161-a1a1a93a72d8 nodeName:}" failed. No retries permitted until 2026-04-17 16:52:39.327902711 +0000 UTC m=+4.234787512 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-5lz7r" (UniqueName: "kubernetes.io/projected/33d21ed2-8e33-49bf-a161-a1a1a93a72d8-kube-api-access-5lz7r") pod "network-check-target-blsng" (UID: "33d21ed2-8e33-49bf-a161-a1a1a93a72d8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:52:38.542830 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:38.542748 2568 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 16:47:36 +0000 UTC" deadline="2027-10-16 15:29:29.776348901 +0000 UTC" Apr 17 16:52:38.542830 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:38.542778 2568 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13126h36m51.233573362s" Apr 17 16:52:38.649605 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:38.649543 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9w74t" event={"ID":"df31caad-3c8d-4eb9-88a9-601343e39692","Type":"ContainerStarted","Data":"c2949c0b0c17dce042c0ec7925ce4ff1d948639e5741e0b74655bc0d1ef876ec"} Apr 17 16:52:38.651649 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:38.651619 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-h5kgg" event={"ID":"747af7cf-1df7-4cfd-8bb2-841945c9fd3e","Type":"ContainerStarted","Data":"e8ac212f2d946451fe39bb04038bf58d81aa7a3fc95376d51a955f120e6b6d54"} Apr 17 16:52:38.653289 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:38.653263 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q5p9p" event={"ID":"f461b687-3271-484f-a873-6a5fb0b1214d","Type":"ContainerStarted","Data":"9f384ff797f87e960ca4a0d96457f461276c4dbfc1c2283051c8f021cc2e8293"} Apr 17 16:52:38.656904 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:38.656861 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xwgv2" event={"ID":"bcabc9cc-357b-429a-80c7-605b1281122f","Type":"ContainerStarted","Data":"8f78ec129d51339af5298b3f4ec403d1f9c7eba584569e391cb9acc072c579d3"} Apr 17 16:52:38.658648 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:38.658623 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-9gh9v" event={"ID":"4ee63b4a-1e22-4b4f-a9a3-c0254922b28f","Type":"ContainerStarted","Data":"41c9936f38907b9365738c0c0f13f9371aba9030bf0c413a85a5d1b8584fc477"} Apr 17 16:52:38.661841 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:38.661272 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-88.ec2.internal" event={"ID":"b1ba083eb7d02ba302027e7be390bc5c","Type":"ContainerStarted","Data":"75775cadc6acb3fe81da18441f0b974b9b10f74ced830bfaa2b40f1344a6ba08"} Apr 17 16:52:38.663661 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:38.663597 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-ndxqw" event={"ID":"66341dd8-b441-446a-be14-71280c6960b2","Type":"ContainerStarted","Data":"f5144ecb77a706f75bd1605f035172f6922d5e8508371d8df4b1fbc84709f238"} Apr 17 16:52:38.665239 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:38.665214 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-tvkbq" 
event={"ID":"0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79","Type":"ContainerStarted","Data":"4c99ac5a43a81763ea5ac3bfe5987f7406238e84e31b0a676767e0a03ede8b0f"} Apr 17 16:52:38.666744 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:38.666722 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ct6wx" event={"ID":"561a91fc-d084-4461-b5bf-ad5bc1ac7a9e","Type":"ContainerStarted","Data":"fa45a809ec5f6173ef8fc8269abea8067c499f1763912a0d22406abea8c84e18"} Apr 17 16:52:38.672287 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:38.672265 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-pw92n" event={"ID":"205623d6-59f4-4e27-8196-83daaf7a9d26","Type":"ContainerStarted","Data":"c110c2f089fe8f05f04f1ddb78c802c141a0747ef006a41ed069deb3935ea8b4"} Apr 17 16:52:39.233637 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:39.233552 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1e75e15a-4a21-46fc-8ab6-d31ca6ee91f3-metrics-certs\") pod \"network-metrics-daemon-bcjnr\" (UID: \"1e75e15a-4a21-46fc-8ab6-d31ca6ee91f3\") " pod="openshift-multus/network-metrics-daemon-bcjnr" Apr 17 16:52:39.233798 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:39.233702 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:52:39.233798 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:39.233766 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e75e15a-4a21-46fc-8ab6-d31ca6ee91f3-metrics-certs podName:1e75e15a-4a21-46fc-8ab6-d31ca6ee91f3 nodeName:}" failed. No retries permitted until 2026-04-17 16:52:41.233746244 +0000 UTC m=+6.140631044 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1e75e15a-4a21-46fc-8ab6-d31ca6ee91f3-metrics-certs") pod "network-metrics-daemon-bcjnr" (UID: "1e75e15a-4a21-46fc-8ab6-d31ca6ee91f3") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:52:39.334895 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:39.334238 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5lz7r\" (UniqueName: \"kubernetes.io/projected/33d21ed2-8e33-49bf-a161-a1a1a93a72d8-kube-api-access-5lz7r\") pod \"network-check-target-blsng\" (UID: \"33d21ed2-8e33-49bf-a161-a1a1a93a72d8\") " pod="openshift-network-diagnostics/network-check-target-blsng" Apr 17 16:52:39.334895 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:39.334474 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:52:39.334895 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:39.334495 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:52:39.334895 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:39.334508 2568 projected.go:194] Error preparing data for projected volume kube-api-access-5lz7r for pod openshift-network-diagnostics/network-check-target-blsng: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:52:39.334895 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:39.334563 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/33d21ed2-8e33-49bf-a161-a1a1a93a72d8-kube-api-access-5lz7r podName:33d21ed2-8e33-49bf-a161-a1a1a93a72d8 nodeName:}" failed. No retries permitted until 2026-04-17 16:52:41.334545086 +0000 UTC m=+6.241429884 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-5lz7r" (UniqueName: "kubernetes.io/projected/33d21ed2-8e33-49bf-a161-a1a1a93a72d8-kube-api-access-5lz7r") pod "network-check-target-blsng" (UID: "33d21ed2-8e33-49bf-a161-a1a1a93a72d8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:52:39.636819 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:39.636744 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bcjnr" Apr 17 16:52:39.637263 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:39.636884 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bcjnr" podUID="1e75e15a-4a21-46fc-8ab6-d31ca6ee91f3" Apr 17 16:52:39.637330 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:39.637316 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-blsng" Apr 17 16:52:39.637503 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:39.637402 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-blsng" podUID="33d21ed2-8e33-49bf-a161-a1a1a93a72d8" Apr 17 16:52:39.696764 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:39.696704 2568 generic.go:358] "Generic (PLEG): container finished" podID="d2e5fe7b3b1a217e75d5333a6a0908b9" containerID="77f541962d5c8660b52540d506368b453e6953d3ad621a4097b33256107e2b92" exitCode=0 Apr 17 16:52:39.697666 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:39.697642 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-88.ec2.internal" event={"ID":"d2e5fe7b3b1a217e75d5333a6a0908b9","Type":"ContainerDied","Data":"77f541962d5c8660b52540d506368b453e6953d3ad621a4097b33256107e2b92"} Apr 17 16:52:39.720097 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:39.720045 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-88.ec2.internal" podStartSLOduration=3.720024268 podStartE2EDuration="3.720024268s" podCreationTimestamp="2026-04-17 16:52:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:52:38.676878292 +0000 UTC m=+3.583763109" watchObservedRunningTime="2026-04-17 16:52:39.720024268 +0000 UTC m=+4.626909084" Apr 17 16:52:40.709674 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:40.709638 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-88.ec2.internal" event={"ID":"d2e5fe7b3b1a217e75d5333a6a0908b9","Type":"ContainerStarted","Data":"a0fefc94eca4539b243e22b89e3690f01bab7c9472b71ca4aa566bb39e888601"} Apr 17 16:52:40.725616 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:40.725565 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-88.ec2.internal" podStartSLOduration=4.725547334 podStartE2EDuration="4.725547334s" podCreationTimestamp="2026-04-17 16:52:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:52:40.725114414 +0000 UTC m=+5.631999231" watchObservedRunningTime="2026-04-17 16:52:40.725547334 +0000 UTC m=+5.632432152" Apr 17 16:52:41.255493 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:41.255437 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1e75e15a-4a21-46fc-8ab6-d31ca6ee91f3-metrics-certs\") pod \"network-metrics-daemon-bcjnr\" (UID: \"1e75e15a-4a21-46fc-8ab6-d31ca6ee91f3\") " pod="openshift-multus/network-metrics-daemon-bcjnr" Apr 17 16:52:41.255670 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:41.255604 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:52:41.255740 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:41.255675 2568 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e75e15a-4a21-46fc-8ab6-d31ca6ee91f3-metrics-certs podName:1e75e15a-4a21-46fc-8ab6-d31ca6ee91f3 nodeName:}" failed. No retries permitted until 2026-04-17 16:52:45.255654536 +0000 UTC m=+10.162539346 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1e75e15a-4a21-46fc-8ab6-d31ca6ee91f3-metrics-certs") pod "network-metrics-daemon-bcjnr" (UID: "1e75e15a-4a21-46fc-8ab6-d31ca6ee91f3") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:52:41.356271 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:41.356236 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5lz7r\" (UniqueName: \"kubernetes.io/projected/33d21ed2-8e33-49bf-a161-a1a1a93a72d8-kube-api-access-5lz7r\") pod \"network-check-target-blsng\" (UID: \"33d21ed2-8e33-49bf-a161-a1a1a93a72d8\") " pod="openshift-network-diagnostics/network-check-target-blsng" Apr 17 16:52:41.356462 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:41.356438 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:52:41.356530 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:41.356471 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:52:41.356530 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:41.356485 2568 projected.go:194] Error preparing data for projected volume kube-api-access-5lz7r for pod openshift-network-diagnostics/network-check-target-blsng: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:52:41.356628 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:41.356557 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/33d21ed2-8e33-49bf-a161-a1a1a93a72d8-kube-api-access-5lz7r podName:33d21ed2-8e33-49bf-a161-a1a1a93a72d8 nodeName:}" failed. No retries permitted until 2026-04-17 16:52:45.356535788 +0000 UTC m=+10.263420583 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-5lz7r" (UniqueName: "kubernetes.io/projected/33d21ed2-8e33-49bf-a161-a1a1a93a72d8-kube-api-access-5lz7r") pod "network-check-target-blsng" (UID: "33d21ed2-8e33-49bf-a161-a1a1a93a72d8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:52:41.420996 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:41.420150 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-jgf62"] Apr 17 16:52:41.423806 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:41.423289 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jgf62" Apr 17 16:52:41.423806 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:41.423396 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-jgf62" podUID="e15ee7f9-dfd3-4121-89e2-a4eefd35413e" Apr 17 16:52:41.457208 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:41.457144 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e15ee7f9-dfd3-4121-89e2-a4eefd35413e-original-pull-secret\") pod \"global-pull-secret-syncer-jgf62\" (UID: \"e15ee7f9-dfd3-4121-89e2-a4eefd35413e\") " pod="kube-system/global-pull-secret-syncer-jgf62" Apr 17 16:52:41.457208 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:41.457202 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e15ee7f9-dfd3-4121-89e2-a4eefd35413e-kubelet-config\") pod \"global-pull-secret-syncer-jgf62\" (UID: \"e15ee7f9-dfd3-4121-89e2-a4eefd35413e\") " pod="kube-system/global-pull-secret-syncer-jgf62" Apr 17 16:52:41.457386 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:41.457253 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e15ee7f9-dfd3-4121-89e2-a4eefd35413e-dbus\") pod \"global-pull-secret-syncer-jgf62\" (UID: \"e15ee7f9-dfd3-4121-89e2-a4eefd35413e\") " pod="kube-system/global-pull-secret-syncer-jgf62" Apr 17 16:52:41.558408 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:41.558296 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e15ee7f9-dfd3-4121-89e2-a4eefd35413e-kubelet-config\") pod \"global-pull-secret-syncer-jgf62\" (UID: \"e15ee7f9-dfd3-4121-89e2-a4eefd35413e\") " pod="kube-system/global-pull-secret-syncer-jgf62" Apr 17 16:52:41.558408 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:41.558363 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e15ee7f9-dfd3-4121-89e2-a4eefd35413e-dbus\") pod \"global-pull-secret-syncer-jgf62\" (UID: \"e15ee7f9-dfd3-4121-89e2-a4eefd35413e\") " pod="kube-system/global-pull-secret-syncer-jgf62" Apr 17 16:52:41.558622 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:41.558451 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e15ee7f9-dfd3-4121-89e2-a4eefd35413e-dbus\") pod \"global-pull-secret-syncer-jgf62\" (UID: \"e15ee7f9-dfd3-4121-89e2-a4eefd35413e\") " pod="kube-system/global-pull-secret-syncer-jgf62" Apr 17 16:52:41.558622 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:41.558460 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e15ee7f9-dfd3-4121-89e2-a4eefd35413e-kubelet-config\") pod \"global-pull-secret-syncer-jgf62\" (UID: \"e15ee7f9-dfd3-4121-89e2-a4eefd35413e\") " pod="kube-system/global-pull-secret-syncer-jgf62" Apr 17 16:52:41.558622 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:41.558479 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e15ee7f9-dfd3-4121-89e2-a4eefd35413e-original-pull-secret\") pod \"global-pull-secret-syncer-jgf62\" (UID: \"e15ee7f9-dfd3-4121-89e2-a4eefd35413e\") " pod="kube-system/global-pull-secret-syncer-jgf62" Apr 17 16:52:41.558775 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:41.558624 2568 secret.go:189] Couldn't get secret 
kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 16:52:41.558775 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:41.558701 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e15ee7f9-dfd3-4121-89e2-a4eefd35413e-original-pull-secret podName:e15ee7f9-dfd3-4121-89e2-a4eefd35413e nodeName:}" failed. No retries permitted until 2026-04-17 16:52:42.058682228 +0000 UTC m=+6.965567023 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e15ee7f9-dfd3-4121-89e2-a4eefd35413e-original-pull-secret") pod "global-pull-secret-syncer-jgf62" (UID: "e15ee7f9-dfd3-4121-89e2-a4eefd35413e") : object "kube-system"/"original-pull-secret" not registered Apr 17 16:52:41.635212 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:41.635174 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bcjnr" Apr 17 16:52:41.635393 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:41.635321 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bcjnr" podUID="1e75e15a-4a21-46fc-8ab6-d31ca6ee91f3" Apr 17 16:52:41.635461 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:41.635402 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-blsng" Apr 17 16:52:41.635549 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:41.635516 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-blsng" podUID="33d21ed2-8e33-49bf-a161-a1a1a93a72d8" Apr 17 16:52:42.061531 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:42.061496 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e15ee7f9-dfd3-4121-89e2-a4eefd35413e-original-pull-secret\") pod \"global-pull-secret-syncer-jgf62\" (UID: \"e15ee7f9-dfd3-4121-89e2-a4eefd35413e\") " pod="kube-system/global-pull-secret-syncer-jgf62" Apr 17 16:52:42.062099 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:42.061678 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 16:52:42.062099 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:42.061738 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e15ee7f9-dfd3-4121-89e2-a4eefd35413e-original-pull-secret podName:e15ee7f9-dfd3-4121-89e2-a4eefd35413e nodeName:}" failed. No retries permitted until 2026-04-17 16:52:43.061719751 +0000 UTC m=+7.968604551 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e15ee7f9-dfd3-4121-89e2-a4eefd35413e-original-pull-secret") pod "global-pull-secret-syncer-jgf62" (UID: "e15ee7f9-dfd3-4121-89e2-a4eefd35413e") : object "kube-system"/"original-pull-secret" not registered Apr 17 16:52:42.633898 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:42.633867 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jgf62" Apr 17 16:52:42.634069 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:42.634018 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-jgf62" podUID="e15ee7f9-dfd3-4121-89e2-a4eefd35413e" Apr 17 16:52:43.069967 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:43.069873 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e15ee7f9-dfd3-4121-89e2-a4eefd35413e-original-pull-secret\") pod \"global-pull-secret-syncer-jgf62\" (UID: \"e15ee7f9-dfd3-4121-89e2-a4eefd35413e\") " pod="kube-system/global-pull-secret-syncer-jgf62" Apr 17 16:52:43.070371 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:43.070149 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 16:52:43.070371 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:43.070213 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e15ee7f9-dfd3-4121-89e2-a4eefd35413e-original-pull-secret podName:e15ee7f9-dfd3-4121-89e2-a4eefd35413e nodeName:}" failed. No retries permitted until 2026-04-17 16:52:45.070194436 +0000 UTC m=+9.977079251 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e15ee7f9-dfd3-4121-89e2-a4eefd35413e-original-pull-secret") pod "global-pull-secret-syncer-jgf62" (UID: "e15ee7f9-dfd3-4121-89e2-a4eefd35413e") : object "kube-system"/"original-pull-secret" not registered Apr 17 16:52:43.634256 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:43.633653 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bcjnr" Apr 17 16:52:43.634256 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:43.633674 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-blsng" Apr 17 16:52:43.634256 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:43.633795 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bcjnr" podUID="1e75e15a-4a21-46fc-8ab6-d31ca6ee91f3" Apr 17 16:52:43.634256 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:43.634207 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-blsng" podUID="33d21ed2-8e33-49bf-a161-a1a1a93a72d8" Apr 17 16:52:44.634558 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:44.634526 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jgf62" Apr 17 16:52:44.635044 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:44.634659 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-jgf62" podUID="e15ee7f9-dfd3-4121-89e2-a4eefd35413e" Apr 17 16:52:45.084094 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:45.084012 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e15ee7f9-dfd3-4121-89e2-a4eefd35413e-original-pull-secret\") pod \"global-pull-secret-syncer-jgf62\" (UID: \"e15ee7f9-dfd3-4121-89e2-a4eefd35413e\") " pod="kube-system/global-pull-secret-syncer-jgf62" Apr 17 16:52:45.084263 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:45.084199 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 16:52:45.084263 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:45.084260 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e15ee7f9-dfd3-4121-89e2-a4eefd35413e-original-pull-secret podName:e15ee7f9-dfd3-4121-89e2-a4eefd35413e nodeName:}" failed. No retries permitted until 2026-04-17 16:52:49.084241032 +0000 UTC m=+13.991125832 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e15ee7f9-dfd3-4121-89e2-a4eefd35413e-original-pull-secret") pod "global-pull-secret-syncer-jgf62" (UID: "e15ee7f9-dfd3-4121-89e2-a4eefd35413e") : object "kube-system"/"original-pull-secret" not registered Apr 17 16:52:45.286103 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:45.285605 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1e75e15a-4a21-46fc-8ab6-d31ca6ee91f3-metrics-certs\") pod \"network-metrics-daemon-bcjnr\" (UID: \"1e75e15a-4a21-46fc-8ab6-d31ca6ee91f3\") " pod="openshift-multus/network-metrics-daemon-bcjnr" Apr 17 16:52:45.286103 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:45.285771 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:52:45.286103 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:45.285828 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e75e15a-4a21-46fc-8ab6-d31ca6ee91f3-metrics-certs podName:1e75e15a-4a21-46fc-8ab6-d31ca6ee91f3 nodeName:}" failed. 
No retries permitted until 2026-04-17 16:52:53.285809523 +0000 UTC m=+18.192694320 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1e75e15a-4a21-46fc-8ab6-d31ca6ee91f3-metrics-certs") pod "network-metrics-daemon-bcjnr" (UID: "1e75e15a-4a21-46fc-8ab6-d31ca6ee91f3") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:52:45.386668 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:45.386586 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5lz7r\" (UniqueName: \"kubernetes.io/projected/33d21ed2-8e33-49bf-a161-a1a1a93a72d8-kube-api-access-5lz7r\") pod \"network-check-target-blsng\" (UID: \"33d21ed2-8e33-49bf-a161-a1a1a93a72d8\") " pod="openshift-network-diagnostics/network-check-target-blsng" Apr 17 16:52:45.386825 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:45.386762 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:52:45.386825 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:45.386784 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:52:45.386825 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:45.386796 2568 projected.go:194] Error preparing data for projected volume kube-api-access-5lz7r for pod openshift-network-diagnostics/network-check-target-blsng: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:52:45.386978 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:45.386867 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/33d21ed2-8e33-49bf-a161-a1a1a93a72d8-kube-api-access-5lz7r podName:33d21ed2-8e33-49bf-a161-a1a1a93a72d8 nodeName:}" failed. No retries permitted until 2026-04-17 16:52:53.386848921 +0000 UTC m=+18.293733720 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-5lz7r" (UniqueName: "kubernetes.io/projected/33d21ed2-8e33-49bf-a161-a1a1a93a72d8-kube-api-access-5lz7r") pod "network-check-target-blsng" (UID: "33d21ed2-8e33-49bf-a161-a1a1a93a72d8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:52:45.635395 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:45.634780 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-blsng" Apr 17 16:52:45.635395 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:45.634884 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-blsng" podUID="33d21ed2-8e33-49bf-a161-a1a1a93a72d8" Apr 17 16:52:45.635395 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:45.635251 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bcjnr" Apr 17 16:52:45.635395 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:45.635351 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bcjnr" podUID="1e75e15a-4a21-46fc-8ab6-d31ca6ee91f3" Apr 17 16:52:46.634093 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:46.634059 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jgf62" Apr 17 16:52:46.634284 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:46.634201 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-jgf62" podUID="e15ee7f9-dfd3-4121-89e2-a4eefd35413e" Apr 17 16:52:47.634952 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:47.634365 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bcjnr" Apr 17 16:52:47.634952 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:47.634379 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-blsng" Apr 17 16:52:47.634952 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:47.634564 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-blsng" podUID="33d21ed2-8e33-49bf-a161-a1a1a93a72d8" Apr 17 16:52:47.634952 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:47.634711 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bcjnr" podUID="1e75e15a-4a21-46fc-8ab6-d31ca6ee91f3" Apr 17 16:52:48.634708 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:48.634676 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jgf62" Apr 17 16:52:48.634906 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:48.634780 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-jgf62" podUID="e15ee7f9-dfd3-4121-89e2-a4eefd35413e" Apr 17 16:52:49.115680 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:49.115595 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e15ee7f9-dfd3-4121-89e2-a4eefd35413e-original-pull-secret\") pod \"global-pull-secret-syncer-jgf62\" (UID: \"e15ee7f9-dfd3-4121-89e2-a4eefd35413e\") " pod="kube-system/global-pull-secret-syncer-jgf62" Apr 17 16:52:49.116129 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:49.115734 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 16:52:49.116129 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:49.115793 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e15ee7f9-dfd3-4121-89e2-a4eefd35413e-original-pull-secret podName:e15ee7f9-dfd3-4121-89e2-a4eefd35413e nodeName:}" failed. No retries permitted until 2026-04-17 16:52:57.11577549 +0000 UTC m=+22.022660284 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e15ee7f9-dfd3-4121-89e2-a4eefd35413e-original-pull-secret") pod "global-pull-secret-syncer-jgf62" (UID: "e15ee7f9-dfd3-4121-89e2-a4eefd35413e") : object "kube-system"/"original-pull-secret" not registered Apr 17 16:52:49.634407 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:49.634365 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bcjnr" Apr 17 16:52:49.634587 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:49.634373 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-blsng" Apr 17 16:52:49.634587 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:49.634508 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bcjnr" podUID="1e75e15a-4a21-46fc-8ab6-d31ca6ee91f3" Apr 17 16:52:49.634587 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:49.634567 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-blsng" podUID="33d21ed2-8e33-49bf-a161-a1a1a93a72d8" Apr 17 16:52:50.634397 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:50.634348 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jgf62" Apr 17 16:52:50.634846 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:50.634478 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-jgf62" podUID="e15ee7f9-dfd3-4121-89e2-a4eefd35413e" Apr 17 16:52:51.634123 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:51.634086 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bcjnr" Apr 17 16:52:51.634313 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:51.634086 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-blsng" Apr 17 16:52:51.634313 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:51.634222 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bcjnr" podUID="1e75e15a-4a21-46fc-8ab6-d31ca6ee91f3" Apr 17 16:52:51.634427 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:51.634319 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-blsng" podUID="33d21ed2-8e33-49bf-a161-a1a1a93a72d8" Apr 17 16:52:52.634540 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:52.634495 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jgf62" Apr 17 16:52:52.634980 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:52.634625 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-jgf62" podUID="e15ee7f9-dfd3-4121-89e2-a4eefd35413e" Apr 17 16:52:53.349736 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:53.349707 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1e75e15a-4a21-46fc-8ab6-d31ca6ee91f3-metrics-certs\") pod \"network-metrics-daemon-bcjnr\" (UID: \"1e75e15a-4a21-46fc-8ab6-d31ca6ee91f3\") " pod="openshift-multus/network-metrics-daemon-bcjnr" Apr 17 16:52:53.349965 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:53.349816 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:52:53.349965 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:53.349873 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e75e15a-4a21-46fc-8ab6-d31ca6ee91f3-metrics-certs podName:1e75e15a-4a21-46fc-8ab6-d31ca6ee91f3 nodeName:}" failed. No retries permitted until 2026-04-17 16:53:09.349857543 +0000 UTC m=+34.256742342 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1e75e15a-4a21-46fc-8ab6-d31ca6ee91f3-metrics-certs") pod "network-metrics-daemon-bcjnr" (UID: "1e75e15a-4a21-46fc-8ab6-d31ca6ee91f3") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:52:53.451055 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:53.451020 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5lz7r\" (UniqueName: \"kubernetes.io/projected/33d21ed2-8e33-49bf-a161-a1a1a93a72d8-kube-api-access-5lz7r\") pod \"network-check-target-blsng\" (UID: \"33d21ed2-8e33-49bf-a161-a1a1a93a72d8\") " pod="openshift-network-diagnostics/network-check-target-blsng" Apr 17 16:52:53.451226 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:53.451168 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:52:53.451226 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:53.451186 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:52:53.451226 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:53.451195 2568 projected.go:194] Error preparing data for projected volume kube-api-access-5lz7r for pod openshift-network-diagnostics/network-check-target-blsng: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:52:53.451358 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:53.451244 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/33d21ed2-8e33-49bf-a161-a1a1a93a72d8-kube-api-access-5lz7r podName:33d21ed2-8e33-49bf-a161-a1a1a93a72d8 nodeName:}" failed. No retries permitted until 2026-04-17 16:53:09.451230161 +0000 UTC m=+34.358114955 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-5lz7r" (UniqueName: "kubernetes.io/projected/33d21ed2-8e33-49bf-a161-a1a1a93a72d8-kube-api-access-5lz7r") pod "network-check-target-blsng" (UID: "33d21ed2-8e33-49bf-a161-a1a1a93a72d8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:52:53.634220 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:53.634140 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-blsng" Apr 17 16:52:53.634360 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:53.634271 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-blsng" podUID="33d21ed2-8e33-49bf-a161-a1a1a93a72d8" Apr 17 16:52:53.634360 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:53.634352 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bcjnr" Apr 17 16:52:53.634478 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:53.634461 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bcjnr" podUID="1e75e15a-4a21-46fc-8ab6-d31ca6ee91f3" Apr 17 16:52:54.633841 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:54.633808 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jgf62" Apr 17 16:52:54.634274 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:54.633914 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-jgf62" podUID="e15ee7f9-dfd3-4121-89e2-a4eefd35413e" Apr 17 16:52:55.634971 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:55.634578 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bcjnr" Apr 17 16:52:55.635640 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:55.634628 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-blsng" Apr 17 16:52:55.635640 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:55.635081 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bcjnr" podUID="1e75e15a-4a21-46fc-8ab6-d31ca6ee91f3" Apr 17 16:52:55.635640 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:55.635167 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-blsng" podUID="33d21ed2-8e33-49bf-a161-a1a1a93a72d8" Apr 17 16:52:55.735270 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:55.735236 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9w74t" event={"ID":"df31caad-3c8d-4eb9-88a9-601343e39692","Type":"ContainerStarted","Data":"3dda88d64089fb4b43893d9ded3002f8c03c08615fc19157911653fc39f67358"} Apr 17 16:52:55.736625 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:55.736596 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-h5kgg" event={"ID":"747af7cf-1df7-4cfd-8bb2-841945c9fd3e","Type":"ContainerStarted","Data":"f538b1b02cd231a88f2ac774b28c675f5594e46624b070b605cff1c489456d71"} Apr 17 16:52:55.738928 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:55.738901 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q5p9p_f461b687-3271-484f-a873-6a5fb0b1214d/ovn-acl-logging/0.log" Apr 17 16:52:55.739290 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:55.739251 2568 generic.go:358] "Generic (PLEG): container finished" podID="f461b687-3271-484f-a873-6a5fb0b1214d" containerID="424edb61a6ac06e895f8869ae658cbd70749955c464defe75aedf20e7a87ab49" exitCode=1 Apr 17 16:52:55.739388 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:55.739340 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q5p9p" event={"ID":"f461b687-3271-484f-a873-6a5fb0b1214d","Type":"ContainerStarted","Data":"2d9072c01d6e26fb72e32733f5ff5c729cf9207df877abfa14b1c8c5aba39acb"} Apr 17 16:52:55.739388 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:55.739372 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q5p9p" event={"ID":"f461b687-3271-484f-a873-6a5fb0b1214d","Type":"ContainerStarted","Data":"85ad0e939d8bbe1171fd01f65cf47967eb44d16a4354d89d9899305e7d617023"} Apr 17 16:52:55.739388 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:55.739385 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q5p9p" event={"ID":"f461b687-3271-484f-a873-6a5fb0b1214d","Type":"ContainerStarted","Data":"8c21416d3f472fd00d08e2d1012a7dbc534e3cb97ee36692eab41d1c991922b4"} Apr 17 16:52:55.739546 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:55.739399 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q5p9p" event={"ID":"f461b687-3271-484f-a873-6a5fb0b1214d","Type":"ContainerDied","Data":"424edb61a6ac06e895f8869ae658cbd70749955c464defe75aedf20e7a87ab49"} Apr 17 16:52:55.739546 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:55.739414 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q5p9p" event={"ID":"f461b687-3271-484f-a873-6a5fb0b1214d","Type":"ContainerStarted","Data":"08d84aa4b93d20e75ecd0096c1b8b224e98d4163b872499ddb34770f6448de5c"} Apr 17 16:52:55.740711 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:55.740686 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xwgv2" event={"ID":"bcabc9cc-357b-429a-80c7-605b1281122f","Type":"ContainerStarted","Data":"71a00ef7cac90668fdc6cc6838d4e3ebfb42aa2df2bdebc11023c1a0fe31fe9e"} Apr 17 16:52:55.742081 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:55.742059 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-ndxqw" 
event={"ID":"66341dd8-b441-446a-be14-71280c6960b2","Type":"ContainerStarted","Data":"b3e914f7d21e12d5b869ef4380f7577dd31f9fe9c805b0d744c9bbd4bb2db202"} Apr 17 16:52:55.743345 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:55.743322 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-tvkbq" event={"ID":"0e1ddd0a-ef10-44ae-a5c9-d1857ba5dd79","Type":"ContainerStarted","Data":"ba46b650bbb6e628ee856c0ab4416144f3ec8d9bca15ee48292a37b6c9b29f4e"} Apr 17 16:52:55.744820 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:55.744800 2568 generic.go:358] "Generic (PLEG): container finished" podID="561a91fc-d084-4461-b5bf-ad5bc1ac7a9e" containerID="f4c774bb2ea1593a600463e432b9cc58fe91622de0ee1eb91f2219e716f33641" exitCode=0 Apr 17 16:52:55.744897 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:55.744861 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ct6wx" event={"ID":"561a91fc-d084-4461-b5bf-ad5bc1ac7a9e","Type":"ContainerDied","Data":"f4c774bb2ea1593a600463e432b9cc58fe91622de0ee1eb91f2219e716f33641"} Apr 17 16:52:55.746191 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:55.746170 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-pw92n" event={"ID":"205623d6-59f4-4e27-8196-83daaf7a9d26","Type":"ContainerStarted","Data":"f782537e69e7db49d9600ee4bf459a6d2675d89bfdc471ac355f5519a89c8441"} Apr 17 16:52:55.752808 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:55.752770 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-h5kgg" podStartSLOduration=4.08476069 podStartE2EDuration="20.752760762s" podCreationTimestamp="2026-04-17 16:52:35 +0000 UTC" firstStartedPulling="2026-04-17 16:52:38.257051962 +0000 UTC m=+3.163936771" lastFinishedPulling="2026-04-17 16:52:54.925052046 +0000 UTC m=+19.831936843" observedRunningTime="2026-04-17 16:52:55.752688625 +0000 UTC m=+20.659573441" watchObservedRunningTime="2026-04-17 16:52:55.752760762 +0000 UTC m=+20.659645577" Apr 17 16:52:55.770515 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:55.770480 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-pw92n" podStartSLOduration=4.094334781 podStartE2EDuration="20.770471007s" podCreationTimestamp="2026-04-17 16:52:35 +0000 UTC" firstStartedPulling="2026-04-17 16:52:38.248890218 +0000 UTC m=+3.155775015" lastFinishedPulling="2026-04-17 16:52:54.925026432 +0000 UTC m=+19.831911241" observedRunningTime="2026-04-17 16:52:55.769851174 +0000 UTC m=+20.676735992" watchObservedRunningTime="2026-04-17 16:52:55.770471007 +0000 UTC m=+20.677355823" Apr 17 16:52:55.784415 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:55.784384 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-ndxqw" podStartSLOduration=11.974678871 podStartE2EDuration="20.7843748s" podCreationTimestamp="2026-04-17 16:52:35 +0000 UTC" firstStartedPulling="2026-04-17 16:52:38.255361044 +0000 UTC m=+3.162245856" lastFinishedPulling="2026-04-17 16:52:47.065056976 +0000 UTC m=+11.971941785" observedRunningTime="2026-04-17 16:52:55.783954309 +0000 UTC m=+20.690839125" watchObservedRunningTime="2026-04-17 16:52:55.7843748 +0000 UTC m=+20.691259616" Apr 17 16:52:55.804893 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:55.804851 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-xwgv2" 
Apr 17 16:52:55.856080 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:55.855175 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-tvkbq" podStartSLOduration=4.01120657 podStartE2EDuration="20.855159315s" podCreationTimestamp="2026-04-17 16:52:35 +0000 UTC" firstStartedPulling="2026-04-17 16:52:38.25187977 +0000 UTC m=+3.158764577" lastFinishedPulling="2026-04-17 16:52:55.095832517 +0000 UTC m=+20.002717322" observedRunningTime="2026-04-17 16:52:55.854477808 +0000 UTC m=+20.761362624" watchObservedRunningTime="2026-04-17 16:52:55.855159315 +0000 UTC m=+20.762044133"
Apr 17 16:52:56.024379 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:56.024200 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-h5kgg"
Apr 17 16:52:56.024912 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:56.024893 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-h5kgg"
Apr 17 16:52:56.166869 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:56.166845 2568 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 17 16:52:56.577164 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:56.577056 2568 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T16:52:56.166865757Z","UUID":"ee0dfba1-4f71-4e77-83e6-d8596ea98a40","Handler":null,"Name":"","Endpoint":""}
Apr 17 16:52:56.579209 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:56.579139 2568 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 17 16:52:56.579209 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:56.579168 2568 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 17 16:52:56.634371 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:56.634346 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jgf62"
Apr 17 16:52:56.634516 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:56.634466 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-jgf62" podUID="e15ee7f9-dfd3-4121-89e2-a4eefd35413e"
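The plugin_watcher/csi_plugin sequence is the CSI node registration handshake: the EBS driver drops a registration socket into /var/lib/kubelet/plugins_registry, the kubelet's watcher notices it, asks the plugin over that socket for its name, endpoint and supported versions, and then validates and registers ebs.csi.aws.com. The real kubelet uses fsnotify plus the pluginregistration gRPC API; a stdlib-only polling sketch of the discovery half, with invented names:

// pluginwatch_sketch.go - illustrative only.
package main

import (
	"fmt"
	"os"
	"path/filepath"
	"strings"
	"time"
)

// pollSockets reports registration sockets that appeared since the last poll.
func pollSockets(dir string, seen map[string]bool) ([]string, error) {
	entries, err := os.ReadDir(dir)
	if err != nil {
		return nil, err
	}
	var added []string
	for _, e := range entries {
		name := e.Name()
		if strings.HasSuffix(name, ".sock") && !seen[name] {
			seen[name] = true
			added = append(added, filepath.Join(dir, name))
		}
	}
	return added, nil
}

func main() {
	seen := map[string]bool{}
	for {
		added, err := pollSockets("/var/lib/kubelet/plugins_registry", seen)
		if err != nil {
			fmt.Println("watch error:", err)
		}
		for _, s := range added {
			// This is where the kubelet would dial the socket, ask for the
			// driver's info, and validate/register it (csi_plugin.go above).
			fmt.Println("adding socket path to desired state cache:", s)
		}
		time.Sleep(time.Second)
	}
}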
pod="kube-system/global-pull-secret-syncer-jgf62" podUID="e15ee7f9-dfd3-4121-89e2-a4eefd35413e" Apr 17 16:52:56.750378 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:56.750344 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9w74t" event={"ID":"df31caad-3c8d-4eb9-88a9-601343e39692","Type":"ContainerStarted","Data":"2d43c625d7aee46f4305dde760bbbf6e8e17ce7d3a132c4703d510a6f96887c2"} Apr 17 16:52:56.753058 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:56.753034 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q5p9p_f461b687-3271-484f-a873-6a5fb0b1214d/ovn-acl-logging/0.log" Apr 17 16:52:56.753411 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:56.753385 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q5p9p" event={"ID":"f461b687-3271-484f-a873-6a5fb0b1214d","Type":"ContainerStarted","Data":"d2327beb0d77d68df285eff533444d40de21be5eae2c452d1961af3cfcc05c69"} Apr 17 16:52:56.756253 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:56.756121 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-9gh9v" event={"ID":"4ee63b4a-1e22-4b4f-a9a3-c0254922b28f","Type":"ContainerStarted","Data":"9f66386aea37e04d3e02048c52d4b674b7402cb43bddd8d843415d383aa56eeb"} Apr 17 16:52:56.756880 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:56.756851 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-h5kgg" Apr 17 16:52:56.757128 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:56.757111 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-h5kgg" Apr 17 16:52:56.771356 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:56.771300 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-9gh9v" podStartSLOduration=5.129706128 podStartE2EDuration="21.771288355s" podCreationTimestamp="2026-04-17 16:52:35 +0000 UTC" firstStartedPulling="2026-04-17 16:52:38.252981561 +0000 UTC m=+3.159866358" lastFinishedPulling="2026-04-17 16:52:54.894563775 +0000 UTC m=+19.801448585" observedRunningTime="2026-04-17 16:52:56.770487196 +0000 UTC m=+21.677372013" watchObservedRunningTime="2026-04-17 16:52:56.771288355 +0000 UTC m=+21.678173171" Apr 17 16:52:57.179433 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:57.179391 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e15ee7f9-dfd3-4121-89e2-a4eefd35413e-original-pull-secret\") pod \"global-pull-secret-syncer-jgf62\" (UID: \"e15ee7f9-dfd3-4121-89e2-a4eefd35413e\") " pod="kube-system/global-pull-secret-syncer-jgf62" Apr 17 16:52:57.179607 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:57.179542 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 16:52:57.179650 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:57.179615 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e15ee7f9-dfd3-4121-89e2-a4eefd35413e-original-pull-secret podName:e15ee7f9-dfd3-4121-89e2-a4eefd35413e nodeName:}" failed. No retries permitted until 2026-04-17 16:53:13.179594658 +0000 UTC m=+38.086479454 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e15ee7f9-dfd3-4121-89e2-a4eefd35413e-original-pull-secret") pod "global-pull-secret-syncer-jgf62" (UID: "e15ee7f9-dfd3-4121-89e2-a4eefd35413e") : object "kube-system"/"original-pull-secret" not registered Apr 17 16:52:57.633847 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:57.633816 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bcjnr" Apr 17 16:52:57.634060 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:57.633962 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bcjnr" podUID="1e75e15a-4a21-46fc-8ab6-d31ca6ee91f3" Apr 17 16:52:57.634060 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:57.634025 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-blsng" Apr 17 16:52:57.634146 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:57.634113 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-blsng" podUID="33d21ed2-8e33-49bf-a161-a1a1a93a72d8" Apr 17 16:52:57.758674 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:57.758631 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9w74t" event={"ID":"df31caad-3c8d-4eb9-88a9-601343e39692","Type":"ContainerStarted","Data":"88ac73f2a46c084313db93137ef3bba7a55e092094db2f1fdd1b35ec0709e1ab"} Apr 17 16:52:57.781766 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:57.781719 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9w74t" podStartSLOduration=3.845948144 podStartE2EDuration="22.781703093s" podCreationTimestamp="2026-04-17 16:52:35 +0000 UTC" firstStartedPulling="2026-04-17 16:52:38.258078631 +0000 UTC m=+3.164963425" lastFinishedPulling="2026-04-17 16:52:57.193833566 +0000 UTC m=+22.100718374" observedRunningTime="2026-04-17 16:52:57.780155749 +0000 UTC m=+22.687040565" watchObservedRunningTime="2026-04-17 16:52:57.781703093 +0000 UTC m=+22.688587911" Apr 17 16:52:58.634339 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:58.634313 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jgf62" Apr 17 16:52:58.634490 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:58.634416 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-jgf62" podUID="e15ee7f9-dfd3-4121-89e2-a4eefd35413e" Apr 17 16:52:58.763187 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:58.763162 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q5p9p_f461b687-3271-484f-a873-6a5fb0b1214d/ovn-acl-logging/0.log" Apr 17 16:52:58.763748 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:58.763497 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q5p9p" event={"ID":"f461b687-3271-484f-a873-6a5fb0b1214d","Type":"ContainerStarted","Data":"c53df7929d13090d2a5b304c0c255f0fa7894328d7685c814e24438b90da0bad"} Apr 17 16:52:59.634398 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:59.634366 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bcjnr" Apr 17 16:52:59.634562 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:59.634477 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bcjnr" podUID="1e75e15a-4a21-46fc-8ab6-d31ca6ee91f3" Apr 17 16:52:59.634562 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:52:59.634534 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-blsng" Apr 17 16:52:59.634664 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:52:59.634583 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-blsng" podUID="33d21ed2-8e33-49bf-a161-a1a1a93a72d8" Apr 17 16:53:00.634151 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:00.633991 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jgf62" Apr 17 16:53:00.634638 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:00.634219 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-jgf62" podUID="e15ee7f9-dfd3-4121-89e2-a4eefd35413e" Apr 17 16:53:00.769936 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:00.769907 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q5p9p_f461b687-3271-484f-a873-6a5fb0b1214d/ovn-acl-logging/0.log" Apr 17 16:53:00.770269 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:00.770248 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q5p9p" event={"ID":"f461b687-3271-484f-a873-6a5fb0b1214d","Type":"ContainerStarted","Data":"e09419330e2714c08f330668bfe6c8e2a81e699308ec80a5960c2e479ae8ce90"} Apr 17 16:53:00.770570 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:00.770539 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-q5p9p" Apr 17 16:53:00.770712 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:00.770695 2568 scope.go:117] "RemoveContainer" containerID="424edb61a6ac06e895f8869ae658cbd70749955c464defe75aedf20e7a87ab49" Apr 17 16:53:00.772059 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:00.772036 2568 generic.go:358] "Generic (PLEG): container finished" podID="561a91fc-d084-4461-b5bf-ad5bc1ac7a9e" containerID="9bf7fc7ab7a687b82a855c9999d2fa866063f056484ba6a369d16fd57bbbb17d" exitCode=0 Apr 17 16:53:00.772149 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:00.772068 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ct6wx" event={"ID":"561a91fc-d084-4461-b5bf-ad5bc1ac7a9e","Type":"ContainerDied","Data":"9bf7fc7ab7a687b82a855c9999d2fa866063f056484ba6a369d16fd57bbbb17d"} Apr 17 16:53:00.787384 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:00.787362 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-q5p9p" Apr 17 16:53:01.634219 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:01.634191 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-blsng" Apr 17 16:53:01.634554 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:01.634293 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-blsng" podUID="33d21ed2-8e33-49bf-a161-a1a1a93a72d8" Apr 17 16:53:01.634554 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:01.634386 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bcjnr" Apr 17 16:53:01.634554 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:01.634523 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bcjnr" podUID="1e75e15a-4a21-46fc-8ab6-d31ca6ee91f3" Apr 17 16:53:01.777821 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:01.777658 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q5p9p_f461b687-3271-484f-a873-6a5fb0b1214d/ovn-acl-logging/0.log" Apr 17 16:53:01.778189 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:01.778162 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q5p9p" event={"ID":"f461b687-3271-484f-a873-6a5fb0b1214d","Type":"ContainerStarted","Data":"0e3f6b50144ce9b75ee7d686cc10bb4a3e6042c4f4e064393cc508edf9e80ae7"} Apr 17 16:53:01.778303 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:01.778288 2568 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 17 16:53:01.778557 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:01.778520 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-q5p9p" Apr 17 16:53:01.780307 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:01.780279 2568 generic.go:358] "Generic (PLEG): container finished" podID="561a91fc-d084-4461-b5bf-ad5bc1ac7a9e" containerID="ff9672d33c75db058dabe299cba5fcefd84a2f6af4eb9b85155423f8693dfcf0" exitCode=0 Apr 17 16:53:01.780418 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:01.780322 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ct6wx" event={"ID":"561a91fc-d084-4461-b5bf-ad5bc1ac7a9e","Type":"ContainerDied","Data":"ff9672d33c75db058dabe299cba5fcefd84a2f6af4eb9b85155423f8693dfcf0"} Apr 17 16:53:01.793016 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:01.792999 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-q5p9p" Apr 17 16:53:01.819486 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:01.819444 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-q5p9p" podStartSLOduration=9.920098627 podStartE2EDuration="26.819432476s" podCreationTimestamp="2026-04-17 16:52:35 +0000 UTC" firstStartedPulling="2026-04-17 16:52:38.256422434 +0000 UTC m=+3.163307231" lastFinishedPulling="2026-04-17 16:52:55.15575628 +0000 UTC m=+20.062641080" observedRunningTime="2026-04-17 16:53:01.816441662 +0000 UTC m=+26.723326479" watchObservedRunningTime="2026-04-17 16:53:01.819432476 +0000 UTC m=+26.726317292" Apr 17 16:53:02.126854 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:02.126624 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-blsng"] Apr 17 16:53:02.126854 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:02.126770 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-blsng" Apr 17 16:53:02.127347 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:02.126890 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-blsng" podUID="33d21ed2-8e33-49bf-a161-a1a1a93a72d8" Apr 17 16:53:02.129662 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:02.129432 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-bcjnr"] Apr 17 16:53:02.129662 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:02.129539 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bcjnr" Apr 17 16:53:02.129662 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:02.129641 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bcjnr" podUID="1e75e15a-4a21-46fc-8ab6-d31ca6ee91f3" Apr 17 16:53:02.130104 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:02.130084 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-jgf62"] Apr 17 16:53:02.130210 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:02.130192 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jgf62" Apr 17 16:53:02.130323 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:02.130278 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-jgf62" podUID="e15ee7f9-dfd3-4121-89e2-a4eefd35413e" Apr 17 16:53:02.783999 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:02.783969 2568 generic.go:358] "Generic (PLEG): container finished" podID="561a91fc-d084-4461-b5bf-ad5bc1ac7a9e" containerID="7d8e54b7ae01a2e80fdd6182168322929ea56c01ae5aafb858a961ba78ecf48b" exitCode=0 Apr 17 16:53:02.784619 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:02.784052 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ct6wx" event={"ID":"561a91fc-d084-4461-b5bf-ad5bc1ac7a9e","Type":"ContainerDied","Data":"7d8e54b7ae01a2e80fdd6182168322929ea56c01ae5aafb858a961ba78ecf48b"} Apr 17 16:53:02.784619 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:02.784218 2568 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 17 16:53:03.634592 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:03.634557 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-blsng" Apr 17 16:53:03.634776 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:03.634674 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bcjnr" Apr 17 16:53:03.634776 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:03.634685 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-blsng" podUID="33d21ed2-8e33-49bf-a161-a1a1a93a72d8" Apr 17 16:53:03.634903 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:03.634857 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jgf62" Apr 17 16:53:03.634980 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:03.634854 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bcjnr" podUID="1e75e15a-4a21-46fc-8ab6-d31ca6ee91f3" Apr 17 16:53:03.635033 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:03.634972 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-jgf62" podUID="e15ee7f9-dfd3-4121-89e2-a4eefd35413e" Apr 17 16:53:03.785976 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:03.785941 2568 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 17 16:53:05.635316 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:05.635276 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-blsng" Apr 17 16:53:05.635852 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:05.635378 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bcjnr" Apr 17 16:53:05.635852 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:05.635424 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jgf62" Apr 17 16:53:05.635852 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:05.635466 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-blsng" podUID="33d21ed2-8e33-49bf-a161-a1a1a93a72d8" Apr 17 16:53:05.635852 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:05.635509 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bcjnr" podUID="1e75e15a-4a21-46fc-8ab6-d31ca6ee91f3" Apr 17 16:53:05.635852 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:05.635598 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-jgf62" podUID="e15ee7f9-dfd3-4121-89e2-a4eefd35413e" Apr 17 16:53:06.528197 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:06.528162 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-q5p9p" Apr 17 16:53:06.528421 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:06.528405 2568 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 17 16:53:06.539781 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:06.539755 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-q5p9p" Apr 17 16:53:07.634173 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:07.634123 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-blsng" Apr 17 16:53:07.634173 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:07.634153 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bcjnr" Apr 17 16:53:07.634674 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:07.634242 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-blsng" podUID="33d21ed2-8e33-49bf-a161-a1a1a93a72d8" Apr 17 16:53:07.634674 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:07.634275 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jgf62" Apr 17 16:53:07.634674 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:07.634371 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-jgf62" podUID="e15ee7f9-dfd3-4121-89e2-a4eefd35413e" Apr 17 16:53:07.634674 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:07.634507 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bcjnr" podUID="1e75e15a-4a21-46fc-8ab6-d31ca6ee91f3" Apr 17 16:53:07.928057 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:07.927823 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-88.ec2.internal" event="NodeReady" Apr 17 16:53:07.928281 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:07.928137 2568 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 17 16:53:07.978223 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:07.978196 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-bc8nx"] Apr 17 16:53:07.982851 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:07.982829 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-k7gjh"] Apr 17 16:53:07.983018 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:07.983001 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-bc8nx" Apr 17 16:53:07.985376 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:07.985339 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-k7gjh" Apr 17 16:53:07.985486 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:07.985402 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 17 16:53:07.985486 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:07.985447 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 17 16:53:07.985717 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:07.985700 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-8mbmc\"" Apr 17 16:53:07.989505 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:07.989442 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 17 16:53:07.989505 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:07.989448 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 17 16:53:07.989724 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:07.989706 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-bkp6b\"" Apr 17 16:53:07.991322 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:07.991180 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 17 16:53:07.993305 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:07.993264 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-bc8nx"] Apr 17 16:53:07.994087 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:07.994050 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-k7gjh"] Apr 17 16:53:08.168375 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:08.168346 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/707c1483-2d89-4157-80d6-4356800a454b-cert\") pod \"ingress-canary-k7gjh\" (UID: \"707c1483-2d89-4157-80d6-4356800a454b\") " pod="openshift-ingress-canary/ingress-canary-k7gjh" Apr 17 16:53:08.168375 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:08.168382 2568 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdr7g\" (UniqueName: \"kubernetes.io/projected/b9cacbec-64af-43d7-85d4-fde767a1cfa3-kube-api-access-fdr7g\") pod \"dns-default-bc8nx\" (UID: \"b9cacbec-64af-43d7-85d4-fde767a1cfa3\") " pod="openshift-dns/dns-default-bc8nx" Apr 17 16:53:08.168568 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:08.168426 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b9cacbec-64af-43d7-85d4-fde767a1cfa3-tmp-dir\") pod \"dns-default-bc8nx\" (UID: \"b9cacbec-64af-43d7-85d4-fde767a1cfa3\") " pod="openshift-dns/dns-default-bc8nx" Apr 17 16:53:08.168568 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:08.168480 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b9cacbec-64af-43d7-85d4-fde767a1cfa3-config-volume\") pod \"dns-default-bc8nx\" (UID: \"b9cacbec-64af-43d7-85d4-fde767a1cfa3\") " pod="openshift-dns/dns-default-bc8nx" Apr 17 16:53:08.168568 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:08.168505 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b9cacbec-64af-43d7-85d4-fde767a1cfa3-metrics-tls\") pod \"dns-default-bc8nx\" (UID: \"b9cacbec-64af-43d7-85d4-fde767a1cfa3\") " pod="openshift-dns/dns-default-bc8nx" Apr 17 16:53:08.168660 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:08.168579 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zk6z4\" (UniqueName: \"kubernetes.io/projected/707c1483-2d89-4157-80d6-4356800a454b-kube-api-access-zk6z4\") pod \"ingress-canary-k7gjh\" (UID: \"707c1483-2d89-4157-80d6-4356800a454b\") " pod="openshift-ingress-canary/ingress-canary-k7gjh" Apr 17 16:53:08.269665 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:08.269638 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fdr7g\" (UniqueName: \"kubernetes.io/projected/b9cacbec-64af-43d7-85d4-fde767a1cfa3-kube-api-access-fdr7g\") pod \"dns-default-bc8nx\" (UID: \"b9cacbec-64af-43d7-85d4-fde767a1cfa3\") " pod="openshift-dns/dns-default-bc8nx" Apr 17 16:53:08.269796 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:08.269685 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b9cacbec-64af-43d7-85d4-fde767a1cfa3-tmp-dir\") pod \"dns-default-bc8nx\" (UID: \"b9cacbec-64af-43d7-85d4-fde767a1cfa3\") " pod="openshift-dns/dns-default-bc8nx" Apr 17 16:53:08.269796 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:08.269714 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b9cacbec-64af-43d7-85d4-fde767a1cfa3-config-volume\") pod \"dns-default-bc8nx\" (UID: \"b9cacbec-64af-43d7-85d4-fde767a1cfa3\") " pod="openshift-dns/dns-default-bc8nx" Apr 17 16:53:08.269867 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:08.269819 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b9cacbec-64af-43d7-85d4-fde767a1cfa3-metrics-tls\") pod \"dns-default-bc8nx\" (UID: \"b9cacbec-64af-43d7-85d4-fde767a1cfa3\") " pod="openshift-dns/dns-default-bc8nx" Apr 17 16:53:08.269915 
Apr 17 16:53:08.269915 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:08.269890 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/707c1483-2d89-4157-80d6-4356800a454b-cert\") pod \"ingress-canary-k7gjh\" (UID: \"707c1483-2d89-4157-80d6-4356800a454b\") " pod="openshift-ingress-canary/ingress-canary-k7gjh"
Apr 17 16:53:08.270039 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:08.270008 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 16:53:08.270039 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:08.270019 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 16:53:08.270132 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:08.270061 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b9cacbec-64af-43d7-85d4-fde767a1cfa3-tmp-dir\") pod \"dns-default-bc8nx\" (UID: \"b9cacbec-64af-43d7-85d4-fde767a1cfa3\") " pod="openshift-dns/dns-default-bc8nx"
Apr 17 16:53:08.270132 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:08.270070 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/707c1483-2d89-4157-80d6-4356800a454b-cert podName:707c1483-2d89-4157-80d6-4356800a454b nodeName:}" failed. No retries permitted until 2026-04-17 16:53:08.770052522 +0000 UTC m=+33.676937320 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/707c1483-2d89-4157-80d6-4356800a454b-cert") pod "ingress-canary-k7gjh" (UID: "707c1483-2d89-4157-80d6-4356800a454b") : secret "canary-serving-cert" not found
Apr 17 16:53:08.270132 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:08.270118 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9cacbec-64af-43d7-85d4-fde767a1cfa3-metrics-tls podName:b9cacbec-64af-43d7-85d4-fde767a1cfa3 nodeName:}" failed. No retries permitted until 2026-04-17 16:53:08.770099395 +0000 UTC m=+33.676984197 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b9cacbec-64af-43d7-85d4-fde767a1cfa3-metrics-tls") pod "dns-default-bc8nx" (UID: "b9cacbec-64af-43d7-85d4-fde767a1cfa3") : secret "dns-default-metrics-tls" not found
Apr 17 16:53:08.270300 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:08.270243 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b9cacbec-64af-43d7-85d4-fde767a1cfa3-config-volume\") pod \"dns-default-bc8nx\" (UID: \"b9cacbec-64af-43d7-85d4-fde767a1cfa3\") " pod="openshift-dns/dns-default-bc8nx"
Apr 17 16:53:08.280169 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:08.280147 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zk6z4\" (UniqueName: \"kubernetes.io/projected/707c1483-2d89-4157-80d6-4356800a454b-kube-api-access-zk6z4\") pod \"ingress-canary-k7gjh\" (UID: \"707c1483-2d89-4157-80d6-4356800a454b\") " pod="openshift-ingress-canary/ingress-canary-k7gjh"
Apr 17 16:53:08.280275 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:08.280174 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdr7g\" (UniqueName: \"kubernetes.io/projected/b9cacbec-64af-43d7-85d4-fde767a1cfa3-kube-api-access-fdr7g\") pod \"dns-default-bc8nx\" (UID: \"b9cacbec-64af-43d7-85d4-fde767a1cfa3\") " pod="openshift-dns/dns-default-bc8nx"
Apr 17 16:53:08.773328 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:08.773266 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b9cacbec-64af-43d7-85d4-fde767a1cfa3-metrics-tls\") pod \"dns-default-bc8nx\" (UID: \"b9cacbec-64af-43d7-85d4-fde767a1cfa3\") " pod="openshift-dns/dns-default-bc8nx"
Apr 17 16:53:08.773618 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:08.773370 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/707c1483-2d89-4157-80d6-4356800a454b-cert\") pod \"ingress-canary-k7gjh\" (UID: \"707c1483-2d89-4157-80d6-4356800a454b\") " pod="openshift-ingress-canary/ingress-canary-k7gjh"
Apr 17 16:53:08.773618 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:08.773403 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 16:53:08.773618 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:08.773458 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 16:53:08.773618 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:08.773462 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9cacbec-64af-43d7-85d4-fde767a1cfa3-metrics-tls podName:b9cacbec-64af-43d7-85d4-fde767a1cfa3 nodeName:}" failed. No retries permitted until 2026-04-17 16:53:09.773444244 +0000 UTC m=+34.680329040 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b9cacbec-64af-43d7-85d4-fde767a1cfa3-metrics-tls") pod "dns-default-bc8nx" (UID: "b9cacbec-64af-43d7-85d4-fde767a1cfa3") : secret "dns-default-metrics-tls" not found
Apr 17 16:53:08.773618 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:08.773491 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/707c1483-2d89-4157-80d6-4356800a454b-cert podName:707c1483-2d89-4157-80d6-4356800a454b nodeName:}" failed. No retries permitted until 2026-04-17 16:53:09.773480889 +0000 UTC m=+34.680365687 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/707c1483-2d89-4157-80d6-4356800a454b-cert") pod "ingress-canary-k7gjh" (UID: "707c1483-2d89-4157-80d6-4356800a454b") : secret "canary-serving-cert" not found
No retries permitted until 2026-04-17 16:53:09.773480889 +0000 UTC m=+34.680365687 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/707c1483-2d89-4157-80d6-4356800a454b-cert") pod "ingress-canary-k7gjh" (UID: "707c1483-2d89-4157-80d6-4356800a454b") : secret "canary-serving-cert" not found Apr 17 16:53:08.799637 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:08.799608 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ct6wx" event={"ID":"561a91fc-d084-4461-b5bf-ad5bc1ac7a9e","Type":"ContainerStarted","Data":"0a2097cf592cccf29ea310eca3de2a074438c3955cb0a8c9051d8262ea7d1352"} Apr 17 16:53:09.377826 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:09.377794 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1e75e15a-4a21-46fc-8ab6-d31ca6ee91f3-metrics-certs\") pod \"network-metrics-daemon-bcjnr\" (UID: \"1e75e15a-4a21-46fc-8ab6-d31ca6ee91f3\") " pod="openshift-multus/network-metrics-daemon-bcjnr" Apr 17 16:53:09.378015 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:09.377949 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:53:09.378015 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:09.378010 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e75e15a-4a21-46fc-8ab6-d31ca6ee91f3-metrics-certs podName:1e75e15a-4a21-46fc-8ab6-d31ca6ee91f3 nodeName:}" failed. No retries permitted until 2026-04-17 16:53:41.377992591 +0000 UTC m=+66.284877397 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1e75e15a-4a21-46fc-8ab6-d31ca6ee91f3-metrics-certs") pod "network-metrics-daemon-bcjnr" (UID: "1e75e15a-4a21-46fc-8ab6-d31ca6ee91f3") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:53:09.478509 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:09.478477 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5lz7r\" (UniqueName: \"kubernetes.io/projected/33d21ed2-8e33-49bf-a161-a1a1a93a72d8-kube-api-access-5lz7r\") pod \"network-check-target-blsng\" (UID: \"33d21ed2-8e33-49bf-a161-a1a1a93a72d8\") " pod="openshift-network-diagnostics/network-check-target-blsng" Apr 17 16:53:09.478650 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:09.478630 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:53:09.478691 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:09.478654 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:53:09.478691 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:09.478665 2568 projected.go:194] Error preparing data for projected volume kube-api-access-5lz7r for pod openshift-network-diagnostics/network-check-target-blsng: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:53:09.478750 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:09.478712 2568 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/33d21ed2-8e33-49bf-a161-a1a1a93a72d8-kube-api-access-5lz7r podName:33d21ed2-8e33-49bf-a161-a1a1a93a72d8 nodeName:}" failed. No retries permitted until 2026-04-17 16:53:41.478698259 +0000 UTC m=+66.385583053 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-5lz7r" (UniqueName: "kubernetes.io/projected/33d21ed2-8e33-49bf-a161-a1a1a93a72d8-kube-api-access-5lz7r") pod "network-check-target-blsng" (UID: "33d21ed2-8e33-49bf-a161-a1a1a93a72d8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:53:09.634655 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:09.634574 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jgf62" Apr 17 16:53:09.634796 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:09.634573 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bcjnr" Apr 17 16:53:09.634796 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:09.634574 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-blsng" Apr 17 16:53:09.639380 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:09.639308 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 16:53:09.639380 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:09.639325 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 16:53:09.639380 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:09.639335 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-bkwcz\"" Apr 17 16:53:09.639380 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:09.639327 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 17 16:53:09.639670 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:09.639315 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 16:53:09.639670 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:09.639315 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-wm7r2\"" Apr 17 16:53:09.780160 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:09.780130 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/707c1483-2d89-4157-80d6-4356800a454b-cert\") pod \"ingress-canary-k7gjh\" (UID: \"707c1483-2d89-4157-80d6-4356800a454b\") " pod="openshift-ingress-canary/ingress-canary-k7gjh" Apr 17 16:53:09.780534 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:09.780198 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b9cacbec-64af-43d7-85d4-fde767a1cfa3-metrics-tls\") pod \"dns-default-bc8nx\" (UID: \"b9cacbec-64af-43d7-85d4-fde767a1cfa3\") " pod="openshift-dns/dns-default-bc8nx" Apr 17 16:53:09.780534 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:09.780283 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret 
"dns-default-metrics-tls" not found Apr 17 16:53:09.780534 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:09.780286 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:53:09.780534 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:09.780339 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9cacbec-64af-43d7-85d4-fde767a1cfa3-metrics-tls podName:b9cacbec-64af-43d7-85d4-fde767a1cfa3 nodeName:}" failed. No retries permitted until 2026-04-17 16:53:11.780321655 +0000 UTC m=+36.687206468 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b9cacbec-64af-43d7-85d4-fde767a1cfa3-metrics-tls") pod "dns-default-bc8nx" (UID: "b9cacbec-64af-43d7-85d4-fde767a1cfa3") : secret "dns-default-metrics-tls" not found Apr 17 16:53:09.780534 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:09.780352 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/707c1483-2d89-4157-80d6-4356800a454b-cert podName:707c1483-2d89-4157-80d6-4356800a454b nodeName:}" failed. No retries permitted until 2026-04-17 16:53:11.780346998 +0000 UTC m=+36.687231792 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/707c1483-2d89-4157-80d6-4356800a454b-cert") pod "ingress-canary-k7gjh" (UID: "707c1483-2d89-4157-80d6-4356800a454b") : secret "canary-serving-cert" not found Apr 17 16:53:09.803363 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:09.803341 2568 generic.go:358] "Generic (PLEG): container finished" podID="561a91fc-d084-4461-b5bf-ad5bc1ac7a9e" containerID="0a2097cf592cccf29ea310eca3de2a074438c3955cb0a8c9051d8262ea7d1352" exitCode=0 Apr 17 16:53:09.803484 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:09.803374 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ct6wx" event={"ID":"561a91fc-d084-4461-b5bf-ad5bc1ac7a9e","Type":"ContainerDied","Data":"0a2097cf592cccf29ea310eca3de2a074438c3955cb0a8c9051d8262ea7d1352"} Apr 17 16:53:10.807720 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:10.807690 2568 generic.go:358] "Generic (PLEG): container finished" podID="561a91fc-d084-4461-b5bf-ad5bc1ac7a9e" containerID="318804296bf7a05b6017799c84c395f7347f0ac1c9e78dd5e2b3d4127083a870" exitCode=0 Apr 17 16:53:10.808091 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:10.807751 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ct6wx" event={"ID":"561a91fc-d084-4461-b5bf-ad5bc1ac7a9e","Type":"ContainerDied","Data":"318804296bf7a05b6017799c84c395f7347f0ac1c9e78dd5e2b3d4127083a870"} Apr 17 16:53:11.795395 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:11.795361 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b9cacbec-64af-43d7-85d4-fde767a1cfa3-metrics-tls\") pod \"dns-default-bc8nx\" (UID: \"b9cacbec-64af-43d7-85d4-fde767a1cfa3\") " pod="openshift-dns/dns-default-bc8nx" Apr 17 16:53:11.795561 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:11.795423 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/707c1483-2d89-4157-80d6-4356800a454b-cert\") pod \"ingress-canary-k7gjh\" (UID: \"707c1483-2d89-4157-80d6-4356800a454b\") " pod="openshift-ingress-canary/ingress-canary-k7gjh" Apr 17 16:53:11.795561 ip-10-0-134-88 
kubenswrapper[2568]: E0417 16:53:11.795511 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:53:11.795670 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:11.795570 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9cacbec-64af-43d7-85d4-fde767a1cfa3-metrics-tls podName:b9cacbec-64af-43d7-85d4-fde767a1cfa3 nodeName:}" failed. No retries permitted until 2026-04-17 16:53:15.795555435 +0000 UTC m=+40.702440229 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b9cacbec-64af-43d7-85d4-fde767a1cfa3-metrics-tls") pod "dns-default-bc8nx" (UID: "b9cacbec-64af-43d7-85d4-fde767a1cfa3") : secret "dns-default-metrics-tls" not found Apr 17 16:53:11.795670 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:11.795513 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:53:11.795670 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:11.795656 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/707c1483-2d89-4157-80d6-4356800a454b-cert podName:707c1483-2d89-4157-80d6-4356800a454b nodeName:}" failed. No retries permitted until 2026-04-17 16:53:15.79563873 +0000 UTC m=+40.702523548 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/707c1483-2d89-4157-80d6-4356800a454b-cert") pod "ingress-canary-k7gjh" (UID: "707c1483-2d89-4157-80d6-4356800a454b") : secret "canary-serving-cert" not found Apr 17 16:53:11.812819 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:11.812598 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ct6wx" event={"ID":"561a91fc-d084-4461-b5bf-ad5bc1ac7a9e","Type":"ContainerStarted","Data":"c32edb27d64d4eb4d2010717473c0a56aec2df487a8d53b85f30d6becf7cc9f8"} Apr 17 16:53:11.840017 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:11.839976 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-ct6wx" podStartSLOduration=6.530694218 podStartE2EDuration="36.839959896s" podCreationTimestamp="2026-04-17 16:52:35 +0000 UTC" firstStartedPulling="2026-04-17 16:52:38.250154192 +0000 UTC m=+3.157038987" lastFinishedPulling="2026-04-17 16:53:08.55941986 +0000 UTC m=+33.466304665" observedRunningTime="2026-04-17 16:53:11.836278508 +0000 UTC m=+36.743163324" watchObservedRunningTime="2026-04-17 16:53:11.839959896 +0000 UTC m=+36.746844715" Apr 17 16:53:13.207798 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:13.207750 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e15ee7f9-dfd3-4121-89e2-a4eefd35413e-original-pull-secret\") pod \"global-pull-secret-syncer-jgf62\" (UID: \"e15ee7f9-dfd3-4121-89e2-a4eefd35413e\") " pod="kube-system/global-pull-secret-syncer-jgf62" Apr 17 16:53:13.211168 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:13.211145 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e15ee7f9-dfd3-4121-89e2-a4eefd35413e-original-pull-secret\") pod \"global-pull-secret-syncer-jgf62\" (UID: \"e15ee7f9-dfd3-4121-89e2-a4eefd35413e\") " pod="kube-system/global-pull-secret-syncer-jgf62" Apr 17 16:53:13.246160 ip-10-0-134-88 kubenswrapper[2568]: I0417 
16:53:13.246133 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jgf62" Apr 17 16:53:13.370339 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:13.370310 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-jgf62"] Apr 17 16:53:13.373747 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:53:13.373723 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode15ee7f9_dfd3_4121_89e2_a4eefd35413e.slice/crio-df5f1437e640d1967f73636cfe2f4546ca7ef9ed63ad145e248d4989c9747594 WatchSource:0}: Error finding container df5f1437e640d1967f73636cfe2f4546ca7ef9ed63ad145e248d4989c9747594: Status 404 returned error can't find the container with id df5f1437e640d1967f73636cfe2f4546ca7ef9ed63ad145e248d4989c9747594 Apr 17 16:53:13.817397 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:13.817361 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-jgf62" event={"ID":"e15ee7f9-dfd3-4121-89e2-a4eefd35413e","Type":"ContainerStarted","Data":"df5f1437e640d1967f73636cfe2f4546ca7ef9ed63ad145e248d4989c9747594"} Apr 17 16:53:15.829545 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:15.829508 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b9cacbec-64af-43d7-85d4-fde767a1cfa3-metrics-tls\") pod \"dns-default-bc8nx\" (UID: \"b9cacbec-64af-43d7-85d4-fde767a1cfa3\") " pod="openshift-dns/dns-default-bc8nx" Apr 17 16:53:15.830276 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:15.829577 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/707c1483-2d89-4157-80d6-4356800a454b-cert\") pod \"ingress-canary-k7gjh\" (UID: \"707c1483-2d89-4157-80d6-4356800a454b\") " pod="openshift-ingress-canary/ingress-canary-k7gjh" Apr 17 16:53:15.830276 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:15.829663 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:53:15.830276 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:15.829678 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:53:15.830276 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:15.829729 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9cacbec-64af-43d7-85d4-fde767a1cfa3-metrics-tls podName:b9cacbec-64af-43d7-85d4-fde767a1cfa3 nodeName:}" failed. No retries permitted until 2026-04-17 16:53:23.829710345 +0000 UTC m=+48.736595139 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b9cacbec-64af-43d7-85d4-fde767a1cfa3-metrics-tls") pod "dns-default-bc8nx" (UID: "b9cacbec-64af-43d7-85d4-fde767a1cfa3") : secret "dns-default-metrics-tls" not found Apr 17 16:53:15.830276 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:15.829749 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/707c1483-2d89-4157-80d6-4356800a454b-cert podName:707c1483-2d89-4157-80d6-4356800a454b nodeName:}" failed. No retries permitted until 2026-04-17 16:53:23.829742213 +0000 UTC m=+48.736627008 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/707c1483-2d89-4157-80d6-4356800a454b-cert") pod "ingress-canary-k7gjh" (UID: "707c1483-2d89-4157-80d6-4356800a454b") : secret "canary-serving-cert" not found Apr 17 16:53:16.824170 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:16.824127 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-jgf62" event={"ID":"e15ee7f9-dfd3-4121-89e2-a4eefd35413e","Type":"ContainerStarted","Data":"0ef849ec22aab9dbc1dd7eafe5a23fcadd4be456c7ec7d1c1f4bc65e1690fe08"} Apr 17 16:53:16.841433 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:16.841387 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-jgf62" podStartSLOduration=32.518069765 podStartE2EDuration="35.841374099s" podCreationTimestamp="2026-04-17 16:52:41 +0000 UTC" firstStartedPulling="2026-04-17 16:53:13.375479924 +0000 UTC m=+38.282364717" lastFinishedPulling="2026-04-17 16:53:16.698784244 +0000 UTC m=+41.605669051" observedRunningTime="2026-04-17 16:53:16.841160825 +0000 UTC m=+41.748045641" watchObservedRunningTime="2026-04-17 16:53:16.841374099 +0000 UTC m=+41.748258905" Apr 17 16:53:20.455752 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.455708 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-js8f5"] Apr 17 16:53:20.470123 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.470095 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2lg6q"] Apr 17 16:53:20.470265 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.470224 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-js8f5" Apr 17 16:53:20.474181 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.474162 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 17 16:53:20.477941 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.474790 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-l8tbq\"" Apr 17 16:53:20.477941 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.474906 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 17 16:53:20.485418 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.485395 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-js8f5"] Apr 17 16:53:20.485418 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.485420 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2lg6q"] Apr 17 16:53:20.485554 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.485505 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2lg6q" Apr 17 16:53:20.488149 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.488129 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 17 16:53:20.488253 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.488139 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-qzbdr\"" Apr 17 16:53:20.488253 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.488206 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 17 16:53:20.488333 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.488258 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 17 16:53:20.562999 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.562968 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b8319ed2-b789-4cb0-969b-0ef6032e8f49-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-js8f5\" (UID: \"b8319ed2-b789-4cb0-969b-0ef6032e8f49\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-js8f5" Apr 17 16:53:20.563118 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.563008 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/4ec4e7c7-c56a-4b03-8b87-f5e167104026-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2lg6q\" (UID: \"4ec4e7c7-c56a-4b03-8b87-f5e167104026\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2lg6q" Apr 17 16:53:20.563118 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.563087 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b8319ed2-b789-4cb0-969b-0ef6032e8f49-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-js8f5\" (UID: \"b8319ed2-b789-4cb0-969b-0ef6032e8f49\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-js8f5" Apr 17 16:53:20.563196 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.563127 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dwz6\" (UniqueName: \"kubernetes.io/projected/4ec4e7c7-c56a-4b03-8b87-f5e167104026-kube-api-access-2dwz6\") pod \"cluster-samples-operator-6dc5bdb6b4-2lg6q\" (UID: \"4ec4e7c7-c56a-4b03-8b87-f5e167104026\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2lg6q" Apr 17 16:53:20.566019 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.565998 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-td7gx"] Apr 17 16:53:20.578931 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.578902 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-c7rtc"] Apr 17 16:53:20.579056 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.579039 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-td7gx" Apr 17 16:53:20.581428 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.581408 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 17 16:53:20.581531 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.581480 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 17 16:53:20.581663 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.581644 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-qlb8z\"" Apr 17 16:53:20.581781 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.581765 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 17 16:53:20.581867 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.581851 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 17 16:53:20.596196 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.596172 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-lttq7"] Apr 17 16:53:20.596342 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.596322 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-c7rtc" Apr 17 16:53:20.599625 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.599606 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 17 16:53:20.600905 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.600887 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 17 16:53:20.601012 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.600887 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 17 16:53:20.601012 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.600948 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-zm97j\"" Apr 17 16:53:20.601012 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.600894 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 17 16:53:20.613456 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.613439 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-588fbfbcd9-ddtmz"] Apr 17 16:53:20.613594 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.613580 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-lttq7" Apr 17 16:53:20.616644 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.616625 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 17 16:53:20.616726 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.616624 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-gnhxp\"" Apr 17 16:53:20.616933 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.616906 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 17 16:53:20.617060 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.617037 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 17 16:53:20.617166 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.617120 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 17 16:53:20.624068 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.624050 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 17 16:53:20.631237 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.631219 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-td7gx"] Apr 17 16:53:20.631344 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.631243 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-lttq7"] Apr 17 16:53:20.631344 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.631334 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-c7rtc"] Apr 17 16:53:20.631451 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.631344 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-588fbfbcd9-ddtmz" Apr 17 16:53:20.631451 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.631357 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-588fbfbcd9-ddtmz"] Apr 17 16:53:20.633942 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.633915 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-97wlb\"" Apr 17 16:53:20.634091 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.634078 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 17 16:53:20.634418 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.634404 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 17 16:53:20.634571 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.634460 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 17 16:53:20.643404 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.643378 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 17 16:53:20.664138 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.664044 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/36f13da0-be40-496a-a83a-c62049f5690b-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-td7gx\" (UID: \"36f13da0-be40-496a-a83a-c62049f5690b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-td7gx" Apr 17 16:53:20.664138 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.664080 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a8134e0-b6b3-45d5-a4b4-a9b544913d40-serving-cert\") pod \"insights-operator-585dfdc468-lttq7\" (UID: \"4a8134e0-b6b3-45d5-a4b4-a9b544913d40\") " pod="openshift-insights/insights-operator-585dfdc468-lttq7" Apr 17 16:53:20.664138 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.664108 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70-image-registry-private-configuration\") pod \"image-registry-588fbfbcd9-ddtmz\" (UID: \"5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70\") " pod="openshift-image-registry/image-registry-588fbfbcd9-ddtmz" Apr 17 16:53:20.664138 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.664133 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/30e3b557-37e3-4fa9-9974-a7e12eff41fb-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-c7rtc\" (UID: \"30e3b557-37e3-4fa9-9974-a7e12eff41fb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-c7rtc" Apr 17 16:53:20.664375 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.664158 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6286\" (UniqueName: 
\"kubernetes.io/projected/4a8134e0-b6b3-45d5-a4b4-a9b544913d40-kube-api-access-m6286\") pod \"insights-operator-585dfdc468-lttq7\" (UID: \"4a8134e0-b6b3-45d5-a4b4-a9b544913d40\") " pod="openshift-insights/insights-operator-585dfdc468-lttq7" Apr 17 16:53:20.664375 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.664187 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36f13da0-be40-496a-a83a-c62049f5690b-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-td7gx\" (UID: \"36f13da0-be40-496a-a83a-c62049f5690b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-td7gx" Apr 17 16:53:20.664375 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.664211 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/30e3b557-37e3-4fa9-9974-a7e12eff41fb-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-c7rtc\" (UID: \"30e3b557-37e3-4fa9-9974-a7e12eff41fb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-c7rtc" Apr 17 16:53:20.664375 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.664239 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70-trusted-ca\") pod \"image-registry-588fbfbcd9-ddtmz\" (UID: \"5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70\") " pod="openshift-image-registry/image-registry-588fbfbcd9-ddtmz" Apr 17 16:53:20.664375 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.664279 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/4a8134e0-b6b3-45d5-a4b4-a9b544913d40-snapshots\") pod \"insights-operator-585dfdc468-lttq7\" (UID: \"4a8134e0-b6b3-45d5-a4b4-a9b544913d40\") " pod="openshift-insights/insights-operator-585dfdc468-lttq7" Apr 17 16:53:20.664375 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.664309 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2dwz6\" (UniqueName: \"kubernetes.io/projected/4ec4e7c7-c56a-4b03-8b87-f5e167104026-kube-api-access-2dwz6\") pod \"cluster-samples-operator-6dc5bdb6b4-2lg6q\" (UID: \"4ec4e7c7-c56a-4b03-8b87-f5e167104026\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2lg6q" Apr 17 16:53:20.664375 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.664349 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcbsl\" (UniqueName: \"kubernetes.io/projected/36f13da0-be40-496a-a83a-c62049f5690b-kube-api-access-kcbsl\") pod \"kube-storage-version-migrator-operator-6769c5d45-td7gx\" (UID: \"36f13da0-be40-496a-a83a-c62049f5690b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-td7gx" Apr 17 16:53:20.664637 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.664435 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/4ec4e7c7-c56a-4b03-8b87-f5e167104026-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2lg6q\" (UID: \"4ec4e7c7-c56a-4b03-8b87-f5e167104026\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2lg6q" Apr 17 16:53:20.664637 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.664494 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70-registry-certificates\") pod \"image-registry-588fbfbcd9-ddtmz\" (UID: \"5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70\") " pod="openshift-image-registry/image-registry-588fbfbcd9-ddtmz" Apr 17 16:53:20.664637 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:20.664518 2568 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 16:53:20.664747 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:20.664663 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ec4e7c7-c56a-4b03-8b87-f5e167104026-samples-operator-tls podName:4ec4e7c7-c56a-4b03-8b87-f5e167104026 nodeName:}" failed. No retries permitted until 2026-04-17 16:53:21.16464192 +0000 UTC m=+46.071526726 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/4ec4e7c7-c56a-4b03-8b87-f5e167104026-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-2lg6q" (UID: "4ec4e7c7-c56a-4b03-8b87-f5e167104026") : secret "samples-operator-tls" not found Apr 17 16:53:20.664747 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.664706 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a8134e0-b6b3-45d5-a4b4-a9b544913d40-service-ca-bundle\") pod \"insights-operator-585dfdc468-lttq7\" (UID: \"4a8134e0-b6b3-45d5-a4b4-a9b544913d40\") " pod="openshift-insights/insights-operator-585dfdc468-lttq7" Apr 17 16:53:20.664747 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.664732 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70-installation-pull-secrets\") pod \"image-registry-588fbfbcd9-ddtmz\" (UID: \"5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70\") " pod="openshift-image-registry/image-registry-588fbfbcd9-ddtmz" Apr 17 16:53:20.664894 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.664780 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b8319ed2-b789-4cb0-969b-0ef6032e8f49-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-js8f5\" (UID: \"b8319ed2-b789-4cb0-969b-0ef6032e8f49\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-js8f5" Apr 17 16:53:20.664894 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.664821 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70-bound-sa-token\") pod \"image-registry-588fbfbcd9-ddtmz\" (UID: \"5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70\") " pod="openshift-image-registry/image-registry-588fbfbcd9-ddtmz" Apr 17 16:53:20.664894 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.664847 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdjb6\" (UniqueName: 
\"kubernetes.io/projected/5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70-kube-api-access-sdjb6\") pod \"image-registry-588fbfbcd9-ddtmz\" (UID: \"5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70\") " pod="openshift-image-registry/image-registry-588fbfbcd9-ddtmz" Apr 17 16:53:20.664894 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.664886 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70-registry-tls\") pod \"image-registry-588fbfbcd9-ddtmz\" (UID: \"5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70\") " pod="openshift-image-registry/image-registry-588fbfbcd9-ddtmz" Apr 17 16:53:20.665072 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.664907 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4a8134e0-b6b3-45d5-a4b4-a9b544913d40-tmp\") pod \"insights-operator-585dfdc468-lttq7\" (UID: \"4a8134e0-b6b3-45d5-a4b4-a9b544913d40\") " pod="openshift-insights/insights-operator-585dfdc468-lttq7" Apr 17 16:53:20.665072 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:20.664941 2568 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 16:53:20.665072 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.664965 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a8134e0-b6b3-45d5-a4b4-a9b544913d40-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-lttq7\" (UID: \"4a8134e0-b6b3-45d5-a4b4-a9b544913d40\") " pod="openshift-insights/insights-operator-585dfdc468-lttq7" Apr 17 16:53:20.665072 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:20.664998 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b8319ed2-b789-4cb0-969b-0ef6032e8f49-networking-console-plugin-cert podName:b8319ed2-b789-4cb0-969b-0ef6032e8f49 nodeName:}" failed. No retries permitted until 2026-04-17 16:53:21.164984463 +0000 UTC m=+46.071869260 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/b8319ed2-b789-4cb0-969b-0ef6032e8f49-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-js8f5" (UID: "b8319ed2-b789-4cb0-969b-0ef6032e8f49") : secret "networking-console-plugin-cert" not found Apr 17 16:53:20.665072 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.665035 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b8319ed2-b789-4cb0-969b-0ef6032e8f49-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-js8f5\" (UID: \"b8319ed2-b789-4cb0-969b-0ef6032e8f49\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-js8f5" Apr 17 16:53:20.665072 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.665053 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70-ca-trust-extracted\") pod \"image-registry-588fbfbcd9-ddtmz\" (UID: \"5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70\") " pod="openshift-image-registry/image-registry-588fbfbcd9-ddtmz" Apr 17 16:53:20.665072 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.665069 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7sfjl\" (UniqueName: \"kubernetes.io/projected/30e3b557-37e3-4fa9-9974-a7e12eff41fb-kube-api-access-7sfjl\") pod \"cluster-monitoring-operator-75587bd455-c7rtc\" (UID: \"30e3b557-37e3-4fa9-9974-a7e12eff41fb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-c7rtc" Apr 17 16:53:20.665560 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.665543 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b8319ed2-b789-4cb0-969b-0ef6032e8f49-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-js8f5\" (UID: \"b8319ed2-b789-4cb0-969b-0ef6032e8f49\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-js8f5" Apr 17 16:53:20.677371 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.677344 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dwz6\" (UniqueName: \"kubernetes.io/projected/4ec4e7c7-c56a-4b03-8b87-f5e167104026-kube-api-access-2dwz6\") pod \"cluster-samples-operator-6dc5bdb6b4-2lg6q\" (UID: \"4ec4e7c7-c56a-4b03-8b87-f5e167104026\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2lg6q" Apr 17 16:53:20.766318 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.766284 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a8134e0-b6b3-45d5-a4b4-a9b544913d40-service-ca-bundle\") pod \"insights-operator-585dfdc468-lttq7\" (UID: \"4a8134e0-b6b3-45d5-a4b4-a9b544913d40\") " pod="openshift-insights/insights-operator-585dfdc468-lttq7" Apr 17 16:53:20.766318 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.766322 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70-installation-pull-secrets\") pod \"image-registry-588fbfbcd9-ddtmz\" (UID: \"5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70\") " pod="openshift-image-registry/image-registry-588fbfbcd9-ddtmz" Apr 17 16:53:20.766581 ip-10-0-134-88 kubenswrapper[2568]: I0417 
16:53:20.766559 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70-bound-sa-token\") pod \"image-registry-588fbfbcd9-ddtmz\" (UID: \"5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70\") " pod="openshift-image-registry/image-registry-588fbfbcd9-ddtmz" Apr 17 16:53:20.766632 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.766591 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sdjb6\" (UniqueName: \"kubernetes.io/projected/5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70-kube-api-access-sdjb6\") pod \"image-registry-588fbfbcd9-ddtmz\" (UID: \"5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70\") " pod="openshift-image-registry/image-registry-588fbfbcd9-ddtmz" Apr 17 16:53:20.766632 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.766615 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70-registry-tls\") pod \"image-registry-588fbfbcd9-ddtmz\" (UID: \"5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70\") " pod="openshift-image-registry/image-registry-588fbfbcd9-ddtmz" Apr 17 16:53:20.766762 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.766738 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4a8134e0-b6b3-45d5-a4b4-a9b544913d40-tmp\") pod \"insights-operator-585dfdc468-lttq7\" (UID: \"4a8134e0-b6b3-45d5-a4b4-a9b544913d40\") " pod="openshift-insights/insights-operator-585dfdc468-lttq7" Apr 17 16:53:20.766842 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.766781 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a8134e0-b6b3-45d5-a4b4-a9b544913d40-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-lttq7\" (UID: \"4a8134e0-b6b3-45d5-a4b4-a9b544913d40\") " pod="openshift-insights/insights-operator-585dfdc468-lttq7" Apr 17 16:53:20.766842 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:20.766802 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 16:53:20.766842 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:20.766818 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-588fbfbcd9-ddtmz: secret "image-registry-tls" not found Apr 17 16:53:20.766842 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.766819 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70-ca-trust-extracted\") pod \"image-registry-588fbfbcd9-ddtmz\" (UID: \"5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70\") " pod="openshift-image-registry/image-registry-588fbfbcd9-ddtmz" Apr 17 16:53:20.767106 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.766850 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7sfjl\" (UniqueName: \"kubernetes.io/projected/30e3b557-37e3-4fa9-9974-a7e12eff41fb-kube-api-access-7sfjl\") pod \"cluster-monitoring-operator-75587bd455-c7rtc\" (UID: \"30e3b557-37e3-4fa9-9974-a7e12eff41fb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-c7rtc" Apr 17 16:53:20.767106 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:20.766883 2568 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70-registry-tls podName:5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70 nodeName:}" failed. No retries permitted until 2026-04-17 16:53:21.266862281 +0000 UTC m=+46.173747076 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70-registry-tls") pod "image-registry-588fbfbcd9-ddtmz" (UID: "5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70") : secret "image-registry-tls" not found Apr 17 16:53:20.767106 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.766946 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/36f13da0-be40-496a-a83a-c62049f5690b-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-td7gx\" (UID: \"36f13da0-be40-496a-a83a-c62049f5690b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-td7gx" Apr 17 16:53:20.767106 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.766979 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a8134e0-b6b3-45d5-a4b4-a9b544913d40-serving-cert\") pod \"insights-operator-585dfdc468-lttq7\" (UID: \"4a8134e0-b6b3-45d5-a4b4-a9b544913d40\") " pod="openshift-insights/insights-operator-585dfdc468-lttq7" Apr 17 16:53:20.767308 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.767228 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4a8134e0-b6b3-45d5-a4b4-a9b544913d40-tmp\") pod \"insights-operator-585dfdc468-lttq7\" (UID: \"4a8134e0-b6b3-45d5-a4b4-a9b544913d40\") " pod="openshift-insights/insights-operator-585dfdc468-lttq7" Apr 17 16:53:20.767308 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.767232 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70-image-registry-private-configuration\") pod \"image-registry-588fbfbcd9-ddtmz\" (UID: \"5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70\") " pod="openshift-image-registry/image-registry-588fbfbcd9-ddtmz" Apr 17 16:53:20.767308 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.767297 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/30e3b557-37e3-4fa9-9974-a7e12eff41fb-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-c7rtc\" (UID: \"30e3b557-37e3-4fa9-9974-a7e12eff41fb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-c7rtc" Apr 17 16:53:20.767452 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.767306 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70-ca-trust-extracted\") pod \"image-registry-588fbfbcd9-ddtmz\" (UID: \"5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70\") " pod="openshift-image-registry/image-registry-588fbfbcd9-ddtmz" Apr 17 16:53:20.767452 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.767328 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m6286\" (UniqueName: 
\"kubernetes.io/projected/4a8134e0-b6b3-45d5-a4b4-a9b544913d40-kube-api-access-m6286\") pod \"insights-operator-585dfdc468-lttq7\" (UID: \"4a8134e0-b6b3-45d5-a4b4-a9b544913d40\") " pod="openshift-insights/insights-operator-585dfdc468-lttq7" Apr 17 16:53:20.767452 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.767367 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36f13da0-be40-496a-a83a-c62049f5690b-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-td7gx\" (UID: \"36f13da0-be40-496a-a83a-c62049f5690b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-td7gx" Apr 17 16:53:20.767452 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:20.767393 2568 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 16:53:20.767452 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.767398 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/30e3b557-37e3-4fa9-9974-a7e12eff41fb-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-c7rtc\" (UID: \"30e3b557-37e3-4fa9-9974-a7e12eff41fb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-c7rtc" Apr 17 16:53:20.767452 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.767435 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70-trusted-ca\") pod \"image-registry-588fbfbcd9-ddtmz\" (UID: \"5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70\") " pod="openshift-image-registry/image-registry-588fbfbcd9-ddtmz" Apr 17 16:53:20.767747 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:20.767455 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/30e3b557-37e3-4fa9-9974-a7e12eff41fb-cluster-monitoring-operator-tls podName:30e3b557-37e3-4fa9-9974-a7e12eff41fb nodeName:}" failed. No retries permitted until 2026-04-17 16:53:21.267437823 +0000 UTC m=+46.174322637 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/30e3b557-37e3-4fa9-9974-a7e12eff41fb-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-c7rtc" (UID: "30e3b557-37e3-4fa9-9974-a7e12eff41fb") : secret "cluster-monitoring-operator-tls" not found Apr 17 16:53:20.767747 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.767561 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a8134e0-b6b3-45d5-a4b4-a9b544913d40-service-ca-bundle\") pod \"insights-operator-585dfdc468-lttq7\" (UID: \"4a8134e0-b6b3-45d5-a4b4-a9b544913d40\") " pod="openshift-insights/insights-operator-585dfdc468-lttq7" Apr 17 16:53:20.768014 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.767992 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36f13da0-be40-496a-a83a-c62049f5690b-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-td7gx\" (UID: \"36f13da0-be40-496a-a83a-c62049f5690b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-td7gx" Apr 17 16:53:20.768267 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.768248 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/4a8134e0-b6b3-45d5-a4b4-a9b544913d40-snapshots\") pod \"insights-operator-585dfdc468-lttq7\" (UID: \"4a8134e0-b6b3-45d5-a4b4-a9b544913d40\") " pod="openshift-insights/insights-operator-585dfdc468-lttq7" Apr 17 16:53:20.768318 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.768301 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kcbsl\" (UniqueName: \"kubernetes.io/projected/36f13da0-be40-496a-a83a-c62049f5690b-kube-api-access-kcbsl\") pod \"kube-storage-version-migrator-operator-6769c5d45-td7gx\" (UID: \"36f13da0-be40-496a-a83a-c62049f5690b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-td7gx" Apr 17 16:53:20.768352 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.768335 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/30e3b557-37e3-4fa9-9974-a7e12eff41fb-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-c7rtc\" (UID: \"30e3b557-37e3-4fa9-9974-a7e12eff41fb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-c7rtc" Apr 17 16:53:20.768392 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.768375 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70-registry-certificates\") pod \"image-registry-588fbfbcd9-ddtmz\" (UID: \"5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70\") " pod="openshift-image-registry/image-registry-588fbfbcd9-ddtmz" Apr 17 16:53:20.768963 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.768872 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70-registry-certificates\") pod \"image-registry-588fbfbcd9-ddtmz\" (UID: \"5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70\") " pod="openshift-image-registry/image-registry-588fbfbcd9-ddtmz" Apr 17 16:53:20.769083 ip-10-0-134-88 kubenswrapper[2568]: I0417 
16:53:20.769062 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/4a8134e0-b6b3-45d5-a4b4-a9b544913d40-snapshots\") pod \"insights-operator-585dfdc468-lttq7\" (UID: \"4a8134e0-b6b3-45d5-a4b4-a9b544913d40\") " pod="openshift-insights/insights-operator-585dfdc468-lttq7" Apr 17 16:53:20.769318 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.769296 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70-installation-pull-secrets\") pod \"image-registry-588fbfbcd9-ddtmz\" (UID: \"5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70\") " pod="openshift-image-registry/image-registry-588fbfbcd9-ddtmz" Apr 17 16:53:20.769524 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.769501 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a8134e0-b6b3-45d5-a4b4-a9b544913d40-serving-cert\") pod \"insights-operator-585dfdc468-lttq7\" (UID: \"4a8134e0-b6b3-45d5-a4b4-a9b544913d40\") " pod="openshift-insights/insights-operator-585dfdc468-lttq7" Apr 17 16:53:20.769623 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.769585 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/36f13da0-be40-496a-a83a-c62049f5690b-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-td7gx\" (UID: \"36f13da0-be40-496a-a83a-c62049f5690b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-td7gx" Apr 17 16:53:20.769724 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.769705 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70-image-registry-private-configuration\") pod \"image-registry-588fbfbcd9-ddtmz\" (UID: \"5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70\") " pod="openshift-image-registry/image-registry-588fbfbcd9-ddtmz" Apr 17 16:53:20.774704 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.774680 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70-trusted-ca\") pod \"image-registry-588fbfbcd9-ddtmz\" (UID: \"5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70\") " pod="openshift-image-registry/image-registry-588fbfbcd9-ddtmz" Apr 17 16:53:20.775298 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.775272 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a8134e0-b6b3-45d5-a4b4-a9b544913d40-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-lttq7\" (UID: \"4a8134e0-b6b3-45d5-a4b4-a9b544913d40\") " pod="openshift-insights/insights-operator-585dfdc468-lttq7" Apr 17 16:53:20.778789 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.778770 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdjb6\" (UniqueName: \"kubernetes.io/projected/5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70-kube-api-access-sdjb6\") pod \"image-registry-588fbfbcd9-ddtmz\" (UID: \"5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70\") " pod="openshift-image-registry/image-registry-588fbfbcd9-ddtmz" Apr 17 16:53:20.779198 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.779075 2568 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-7sfjl\" (UniqueName: \"kubernetes.io/projected/30e3b557-37e3-4fa9-9974-a7e12eff41fb-kube-api-access-7sfjl\") pod \"cluster-monitoring-operator-75587bd455-c7rtc\" (UID: \"30e3b557-37e3-4fa9-9974-a7e12eff41fb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-c7rtc" Apr 17 16:53:20.779430 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.779410 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70-bound-sa-token\") pod \"image-registry-588fbfbcd9-ddtmz\" (UID: \"5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70\") " pod="openshift-image-registry/image-registry-588fbfbcd9-ddtmz" Apr 17 16:53:20.780242 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.780218 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6286\" (UniqueName: \"kubernetes.io/projected/4a8134e0-b6b3-45d5-a4b4-a9b544913d40-kube-api-access-m6286\") pod \"insights-operator-585dfdc468-lttq7\" (UID: \"4a8134e0-b6b3-45d5-a4b4-a9b544913d40\") " pod="openshift-insights/insights-operator-585dfdc468-lttq7" Apr 17 16:53:20.788982 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.788957 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcbsl\" (UniqueName: \"kubernetes.io/projected/36f13da0-be40-496a-a83a-c62049f5690b-kube-api-access-kcbsl\") pod \"kube-storage-version-migrator-operator-6769c5d45-td7gx\" (UID: \"36f13da0-be40-496a-a83a-c62049f5690b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-td7gx" Apr 17 16:53:20.888366 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.888341 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-td7gx" Apr 17 16:53:20.921320 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.921294 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-lttq7" Apr 17 16:53:20.997637 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:20.997580 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-td7gx"] Apr 17 16:53:21.010148 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:53:21.010115 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36f13da0_be40_496a_a83a_c62049f5690b.slice/crio-f59eda7bd6be2de51ff31aebad9b58b78d3eb93c6e96996a7886108a503a9cf5 WatchSource:0}: Error finding container f59eda7bd6be2de51ff31aebad9b58b78d3eb93c6e96996a7886108a503a9cf5: Status 404 returned error can't find the container with id f59eda7bd6be2de51ff31aebad9b58b78d3eb93c6e96996a7886108a503a9cf5 Apr 17 16:53:21.062463 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:21.062434 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-lttq7"] Apr 17 16:53:21.064989 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:53:21.064966 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a8134e0_b6b3_45d5_a4b4_a9b544913d40.slice/crio-74b42fae8453f7918f8a71fd355d94c6a107133413fab335b2f7870995b78616 WatchSource:0}: Error finding container 74b42fae8453f7918f8a71fd355d94c6a107133413fab335b2f7870995b78616: Status 404 returned error can't find the container with id 74b42fae8453f7918f8a71fd355d94c6a107133413fab335b2f7870995b78616 Apr 17 16:53:21.171371 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:21.171346 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/4ec4e7c7-c56a-4b03-8b87-f5e167104026-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2lg6q\" (UID: \"4ec4e7c7-c56a-4b03-8b87-f5e167104026\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2lg6q" Apr 17 16:53:21.171466 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:21.171391 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b8319ed2-b789-4cb0-969b-0ef6032e8f49-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-js8f5\" (UID: \"b8319ed2-b789-4cb0-969b-0ef6032e8f49\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-js8f5" Apr 17 16:53:21.171516 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:21.171483 2568 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 16:53:21.171516 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:21.171485 2568 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 16:53:21.171577 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:21.171547 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b8319ed2-b789-4cb0-969b-0ef6032e8f49-networking-console-plugin-cert podName:b8319ed2-b789-4cb0-969b-0ef6032e8f49 nodeName:}" failed. No retries permitted until 2026-04-17 16:53:22.171532543 +0000 UTC m=+47.078417356 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/b8319ed2-b789-4cb0-969b-0ef6032e8f49-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-js8f5" (UID: "b8319ed2-b789-4cb0-969b-0ef6032e8f49") : secret "networking-console-plugin-cert" not found Apr 17 16:53:21.171577 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:21.171561 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ec4e7c7-c56a-4b03-8b87-f5e167104026-samples-operator-tls podName:4ec4e7c7-c56a-4b03-8b87-f5e167104026 nodeName:}" failed. No retries permitted until 2026-04-17 16:53:22.171554884 +0000 UTC m=+47.078439677 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/4ec4e7c7-c56a-4b03-8b87-f5e167104026-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-2lg6q" (UID: "4ec4e7c7-c56a-4b03-8b87-f5e167104026") : secret "samples-operator-tls" not found Apr 17 16:53:21.272378 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:21.272318 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/30e3b557-37e3-4fa9-9974-a7e12eff41fb-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-c7rtc\" (UID: \"30e3b557-37e3-4fa9-9974-a7e12eff41fb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-c7rtc" Apr 17 16:53:21.272476 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:21.272435 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70-registry-tls\") pod \"image-registry-588fbfbcd9-ddtmz\" (UID: \"5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70\") " pod="openshift-image-registry/image-registry-588fbfbcd9-ddtmz" Apr 17 16:53:21.272476 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:21.272448 2568 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 16:53:21.272544 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:21.272492 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/30e3b557-37e3-4fa9-9974-a7e12eff41fb-cluster-monitoring-operator-tls podName:30e3b557-37e3-4fa9-9974-a7e12eff41fb nodeName:}" failed. No retries permitted until 2026-04-17 16:53:22.272479113 +0000 UTC m=+47.179363907 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/30e3b557-37e3-4fa9-9974-a7e12eff41fb-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-c7rtc" (UID: "30e3b557-37e3-4fa9-9974-a7e12eff41fb") : secret "cluster-monitoring-operator-tls" not found Apr 17 16:53:21.272544 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:21.272526 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 16:53:21.272544 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:21.272538 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-588fbfbcd9-ddtmz: secret "image-registry-tls" not found Apr 17 16:53:21.272657 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:21.272593 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70-registry-tls podName:5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70 nodeName:}" failed. No retries permitted until 2026-04-17 16:53:22.272579364 +0000 UTC m=+47.179464176 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70-registry-tls") pod "image-registry-588fbfbcd9-ddtmz" (UID: "5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70") : secret "image-registry-tls" not found Apr 17 16:53:21.834488 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:21.834446 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-lttq7" event={"ID":"4a8134e0-b6b3-45d5-a4b4-a9b544913d40","Type":"ContainerStarted","Data":"74b42fae8453f7918f8a71fd355d94c6a107133413fab335b2f7870995b78616"} Apr 17 16:53:21.835534 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:21.835511 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-td7gx" event={"ID":"36f13da0-be40-496a-a83a-c62049f5690b","Type":"ContainerStarted","Data":"f59eda7bd6be2de51ff31aebad9b58b78d3eb93c6e96996a7886108a503a9cf5"} Apr 17 16:53:22.180103 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:22.180006 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/4ec4e7c7-c56a-4b03-8b87-f5e167104026-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2lg6q\" (UID: \"4ec4e7c7-c56a-4b03-8b87-f5e167104026\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2lg6q" Apr 17 16:53:22.180103 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:22.180088 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b8319ed2-b789-4cb0-969b-0ef6032e8f49-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-js8f5\" (UID: \"b8319ed2-b789-4cb0-969b-0ef6032e8f49\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-js8f5" Apr 17 16:53:22.180289 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:22.180164 2568 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 16:53:22.180289 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:22.180225 2568 secret.go:189] Couldn't get secret 
openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 16:53:22.180289 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:22.180237 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ec4e7c7-c56a-4b03-8b87-f5e167104026-samples-operator-tls podName:4ec4e7c7-c56a-4b03-8b87-f5e167104026 nodeName:}" failed. No retries permitted until 2026-04-17 16:53:24.180221219 +0000 UTC m=+49.087106015 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/4ec4e7c7-c56a-4b03-8b87-f5e167104026-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-2lg6q" (UID: "4ec4e7c7-c56a-4b03-8b87-f5e167104026") : secret "samples-operator-tls" not found Apr 17 16:53:22.180383 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:22.180302 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b8319ed2-b789-4cb0-969b-0ef6032e8f49-networking-console-plugin-cert podName:b8319ed2-b789-4cb0-969b-0ef6032e8f49 nodeName:}" failed. No retries permitted until 2026-04-17 16:53:24.180283491 +0000 UTC m=+49.087168298 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/b8319ed2-b789-4cb0-969b-0ef6032e8f49-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-js8f5" (UID: "b8319ed2-b789-4cb0-969b-0ef6032e8f49") : secret "networking-console-plugin-cert" not found Apr 17 16:53:22.280826 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:22.280792 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70-registry-tls\") pod \"image-registry-588fbfbcd9-ddtmz\" (UID: \"5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70\") " pod="openshift-image-registry/image-registry-588fbfbcd9-ddtmz" Apr 17 16:53:22.281105 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:22.280863 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/30e3b557-37e3-4fa9-9974-a7e12eff41fb-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-c7rtc\" (UID: \"30e3b557-37e3-4fa9-9974-a7e12eff41fb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-c7rtc" Apr 17 16:53:22.281105 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:22.280990 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 16:53:22.281105 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:22.281012 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-588fbfbcd9-ddtmz: secret "image-registry-tls" not found Apr 17 16:53:22.281105 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:22.281031 2568 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 16:53:22.281105 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:22.281080 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70-registry-tls podName:5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70 nodeName:}" failed. No retries permitted until 2026-04-17 16:53:24.281058278 +0000 UTC m=+49.187943072 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70-registry-tls") pod "image-registry-588fbfbcd9-ddtmz" (UID: "5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70") : secret "image-registry-tls" not found Apr 17 16:53:22.281105 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:22.281096 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/30e3b557-37e3-4fa9-9974-a7e12eff41fb-cluster-monitoring-operator-tls podName:30e3b557-37e3-4fa9-9974-a7e12eff41fb nodeName:}" failed. No retries permitted until 2026-04-17 16:53:24.281088824 +0000 UTC m=+49.187973625 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/30e3b557-37e3-4fa9-9974-a7e12eff41fb-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-c7rtc" (UID: "30e3b557-37e3-4fa9-9974-a7e12eff41fb") : secret "cluster-monitoring-operator-tls" not found Apr 17 16:53:23.893616 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:23.893593 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b9cacbec-64af-43d7-85d4-fde767a1cfa3-metrics-tls\") pod \"dns-default-bc8nx\" (UID: \"b9cacbec-64af-43d7-85d4-fde767a1cfa3\") " pod="openshift-dns/dns-default-bc8nx" Apr 17 16:53:23.893935 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:23.893638 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/707c1483-2d89-4157-80d6-4356800a454b-cert\") pod \"ingress-canary-k7gjh\" (UID: \"707c1483-2d89-4157-80d6-4356800a454b\") " pod="openshift-ingress-canary/ingress-canary-k7gjh" Apr 17 16:53:23.893935 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:23.893732 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:53:23.893935 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:23.893753 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:53:23.893935 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:23.893780 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/707c1483-2d89-4157-80d6-4356800a454b-cert podName:707c1483-2d89-4157-80d6-4356800a454b nodeName:}" failed. No retries permitted until 2026-04-17 16:53:39.893765118 +0000 UTC m=+64.800649917 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/707c1483-2d89-4157-80d6-4356800a454b-cert") pod "ingress-canary-k7gjh" (UID: "707c1483-2d89-4157-80d6-4356800a454b") : secret "canary-serving-cert" not found Apr 17 16:53:23.893935 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:23.893814 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9cacbec-64af-43d7-85d4-fde767a1cfa3-metrics-tls podName:b9cacbec-64af-43d7-85d4-fde767a1cfa3 nodeName:}" failed. No retries permitted until 2026-04-17 16:53:39.893795064 +0000 UTC m=+64.800679862 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b9cacbec-64af-43d7-85d4-fde767a1cfa3-metrics-tls") pod "dns-default-bc8nx" (UID: "b9cacbec-64af-43d7-85d4-fde767a1cfa3") : secret "dns-default-metrics-tls" not found Apr 17 16:53:24.196329 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:24.196243 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/4ec4e7c7-c56a-4b03-8b87-f5e167104026-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2lg6q\" (UID: \"4ec4e7c7-c56a-4b03-8b87-f5e167104026\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2lg6q" Apr 17 16:53:24.196329 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:24.196319 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b8319ed2-b789-4cb0-969b-0ef6032e8f49-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-js8f5\" (UID: \"b8319ed2-b789-4cb0-969b-0ef6032e8f49\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-js8f5" Apr 17 16:53:24.196576 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:24.196395 2568 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 16:53:24.196576 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:24.196464 2568 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 16:53:24.196576 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:24.196468 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ec4e7c7-c56a-4b03-8b87-f5e167104026-samples-operator-tls podName:4ec4e7c7-c56a-4b03-8b87-f5e167104026 nodeName:}" failed. No retries permitted until 2026-04-17 16:53:28.196451038 +0000 UTC m=+53.103335835 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/4ec4e7c7-c56a-4b03-8b87-f5e167104026-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-2lg6q" (UID: "4ec4e7c7-c56a-4b03-8b87-f5e167104026") : secret "samples-operator-tls" not found Apr 17 16:53:24.196576 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:24.196534 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b8319ed2-b789-4cb0-969b-0ef6032e8f49-networking-console-plugin-cert podName:b8319ed2-b789-4cb0-969b-0ef6032e8f49 nodeName:}" failed. No retries permitted until 2026-04-17 16:53:28.196517698 +0000 UTC m=+53.103402506 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/b8319ed2-b789-4cb0-969b-0ef6032e8f49-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-js8f5" (UID: "b8319ed2-b789-4cb0-969b-0ef6032e8f49") : secret "networking-console-plugin-cert" not found Apr 17 16:53:24.297500 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:24.297459 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70-registry-tls\") pod \"image-registry-588fbfbcd9-ddtmz\" (UID: \"5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70\") " pod="openshift-image-registry/image-registry-588fbfbcd9-ddtmz" Apr 17 16:53:24.297685 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:24.297516 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/30e3b557-37e3-4fa9-9974-a7e12eff41fb-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-c7rtc\" (UID: \"30e3b557-37e3-4fa9-9974-a7e12eff41fb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-c7rtc" Apr 17 16:53:24.297685 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:24.297605 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 16:53:24.297685 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:24.297622 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-588fbfbcd9-ddtmz: secret "image-registry-tls" not found Apr 17 16:53:24.297685 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:24.297661 2568 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 16:53:24.297842 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:24.297670 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70-registry-tls podName:5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70 nodeName:}" failed. No retries permitted until 2026-04-17 16:53:28.297654793 +0000 UTC m=+53.204539598 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70-registry-tls") pod "image-registry-588fbfbcd9-ddtmz" (UID: "5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70") : secret "image-registry-tls" not found Apr 17 16:53:24.297842 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:24.297716 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/30e3b557-37e3-4fa9-9974-a7e12eff41fb-cluster-monitoring-operator-tls podName:30e3b557-37e3-4fa9-9974-a7e12eff41fb nodeName:}" failed. No retries permitted until 2026-04-17 16:53:28.297704073 +0000 UTC m=+53.204588870 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/30e3b557-37e3-4fa9-9974-a7e12eff41fb-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-c7rtc" (UID: "30e3b557-37e3-4fa9-9974-a7e12eff41fb") : secret "cluster-monitoring-operator-tls" not found Apr 17 16:53:24.842556 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:24.842514 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-lttq7" event={"ID":"4a8134e0-b6b3-45d5-a4b4-a9b544913d40","Type":"ContainerStarted","Data":"8f4f348b737df9afd73cf7255ab1e56a100db76cddc05577c8ab9311da7e123d"} Apr 17 16:53:24.843843 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:24.843819 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-td7gx" event={"ID":"36f13da0-be40-496a-a83a-c62049f5690b","Type":"ContainerStarted","Data":"a1831fe234552656cb1e53637653d2728a40a59dacdd6817a2ab270c175c1633"} Apr 17 16:53:24.863060 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:24.863017 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-lttq7" podStartSLOduration=2.070719149 podStartE2EDuration="4.863003398s" podCreationTimestamp="2026-04-17 16:53:20 +0000 UTC" firstStartedPulling="2026-04-17 16:53:21.066748565 +0000 UTC m=+45.973633359" lastFinishedPulling="2026-04-17 16:53:23.8590328 +0000 UTC m=+48.765917608" observedRunningTime="2026-04-17 16:53:24.861989514 +0000 UTC m=+49.768874331" watchObservedRunningTime="2026-04-17 16:53:24.863003398 +0000 UTC m=+49.769888238" Apr 17 16:53:24.877406 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:24.877359 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-td7gx" podStartSLOduration=2.034865646 podStartE2EDuration="4.877347848s" podCreationTimestamp="2026-04-17 16:53:20 +0000 UTC" firstStartedPulling="2026-04-17 16:53:21.011990118 +0000 UTC m=+45.918874915" lastFinishedPulling="2026-04-17 16:53:23.854472323 +0000 UTC m=+48.761357117" observedRunningTime="2026-04-17 16:53:24.877071007 +0000 UTC m=+49.783955837" watchObservedRunningTime="2026-04-17 16:53:24.877347848 +0000 UTC m=+49.784232672" Apr 17 16:53:27.372288 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:27.372256 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-pw92n_205623d6-59f4-4e27-8196-83daaf7a9d26/dns-node-resolver/0.log" Apr 17 16:53:28.230303 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:28.230270 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/4ec4e7c7-c56a-4b03-8b87-f5e167104026-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2lg6q\" (UID: \"4ec4e7c7-c56a-4b03-8b87-f5e167104026\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2lg6q" Apr 17 16:53:28.230487 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:28.230324 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b8319ed2-b789-4cb0-969b-0ef6032e8f49-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-js8f5\" (UID: \"b8319ed2-b789-4cb0-969b-0ef6032e8f49\") " 
pod="openshift-network-console/networking-console-plugin-cb95c66f6-js8f5" Apr 17 16:53:28.230487 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:28.230413 2568 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 16:53:28.230487 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:28.230423 2568 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 16:53:28.230487 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:28.230466 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b8319ed2-b789-4cb0-969b-0ef6032e8f49-networking-console-plugin-cert podName:b8319ed2-b789-4cb0-969b-0ef6032e8f49 nodeName:}" failed. No retries permitted until 2026-04-17 16:53:36.230451548 +0000 UTC m=+61.137336345 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/b8319ed2-b789-4cb0-969b-0ef6032e8f49-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-js8f5" (UID: "b8319ed2-b789-4cb0-969b-0ef6032e8f49") : secret "networking-console-plugin-cert" not found Apr 17 16:53:28.230487 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:28.230480 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ec4e7c7-c56a-4b03-8b87-f5e167104026-samples-operator-tls podName:4ec4e7c7-c56a-4b03-8b87-f5e167104026 nodeName:}" failed. No retries permitted until 2026-04-17 16:53:36.230473761 +0000 UTC m=+61.137358555 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/4ec4e7c7-c56a-4b03-8b87-f5e167104026-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-2lg6q" (UID: "4ec4e7c7-c56a-4b03-8b87-f5e167104026") : secret "samples-operator-tls" not found Apr 17 16:53:28.331666 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:28.331630 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70-registry-tls\") pod \"image-registry-588fbfbcd9-ddtmz\" (UID: \"5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70\") " pod="openshift-image-registry/image-registry-588fbfbcd9-ddtmz" Apr 17 16:53:28.331803 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:28.331680 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/30e3b557-37e3-4fa9-9974-a7e12eff41fb-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-c7rtc\" (UID: \"30e3b557-37e3-4fa9-9974-a7e12eff41fb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-c7rtc" Apr 17 16:53:28.331803 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:28.331768 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 16:53:28.331803 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:28.331785 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-588fbfbcd9-ddtmz: secret "image-registry-tls" not found Apr 17 16:53:28.331891 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:28.331810 2568 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found 
Apr 17 16:53:28.331891 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:28.331842 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70-registry-tls podName:5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70 nodeName:}" failed. No retries permitted until 2026-04-17 16:53:36.331824077 +0000 UTC m=+61.238708874 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70-registry-tls") pod "image-registry-588fbfbcd9-ddtmz" (UID: "5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70") : secret "image-registry-tls" not found Apr 17 16:53:28.331891 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:28.331856 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/30e3b557-37e3-4fa9-9974-a7e12eff41fb-cluster-monitoring-operator-tls podName:30e3b557-37e3-4fa9-9974-a7e12eff41fb nodeName:}" failed. No retries permitted until 2026-04-17 16:53:36.331849933 +0000 UTC m=+61.238734727 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/30e3b557-37e3-4fa9-9974-a7e12eff41fb-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-c7rtc" (UID: "30e3b557-37e3-4fa9-9974-a7e12eff41fb") : secret "cluster-monitoring-operator-tls" not found Apr 17 16:53:28.372104 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:28.372080 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-ndxqw_66341dd8-b441-446a-be14-71280c6960b2/node-ca/0.log" Apr 17 16:53:29.518868 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:29.518838 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-v9mh6"] Apr 17 16:53:29.555384 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:29.555351 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-v9mh6"] Apr 17 16:53:29.555384 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:29.555365 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-v9mh6" Apr 17 16:53:29.558669 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:29.558652 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 17 16:53:29.559675 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:29.559655 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 17 16:53:29.559778 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:29.559684 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 17 16:53:29.559778 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:29.559710 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 17 16:53:29.559915 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:29.559893 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-xkgfl\"" Apr 17 16:53:29.642688 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:29.642668 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mr5v5\" (UniqueName: \"kubernetes.io/projected/a7450313-0679-4b21-85ed-95035e064ae4-kube-api-access-mr5v5\") pod \"service-ca-865cb79987-v9mh6\" (UID: \"a7450313-0679-4b21-85ed-95035e064ae4\") " pod="openshift-service-ca/service-ca-865cb79987-v9mh6" Apr 17 16:53:29.642799 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:29.642749 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/a7450313-0679-4b21-85ed-95035e064ae4-signing-cabundle\") pod \"service-ca-865cb79987-v9mh6\" (UID: \"a7450313-0679-4b21-85ed-95035e064ae4\") " pod="openshift-service-ca/service-ca-865cb79987-v9mh6" Apr 17 16:53:29.642799 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:29.642775 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/a7450313-0679-4b21-85ed-95035e064ae4-signing-key\") pod \"service-ca-865cb79987-v9mh6\" (UID: \"a7450313-0679-4b21-85ed-95035e064ae4\") " pod="openshift-service-ca/service-ca-865cb79987-v9mh6" Apr 17 16:53:29.743527 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:29.743499 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mr5v5\" (UniqueName: \"kubernetes.io/projected/a7450313-0679-4b21-85ed-95035e064ae4-kube-api-access-mr5v5\") pod \"service-ca-865cb79987-v9mh6\" (UID: \"a7450313-0679-4b21-85ed-95035e064ae4\") " pod="openshift-service-ca/service-ca-865cb79987-v9mh6" Apr 17 16:53:29.743676 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:29.743586 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/a7450313-0679-4b21-85ed-95035e064ae4-signing-cabundle\") pod \"service-ca-865cb79987-v9mh6\" (UID: \"a7450313-0679-4b21-85ed-95035e064ae4\") " pod="openshift-service-ca/service-ca-865cb79987-v9mh6" Apr 17 16:53:29.743676 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:29.743619 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: 
\"kubernetes.io/secret/a7450313-0679-4b21-85ed-95035e064ae4-signing-key\") pod \"service-ca-865cb79987-v9mh6\" (UID: \"a7450313-0679-4b21-85ed-95035e064ae4\") " pod="openshift-service-ca/service-ca-865cb79987-v9mh6" Apr 17 16:53:29.758564 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:29.758536 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/a7450313-0679-4b21-85ed-95035e064ae4-signing-cabundle\") pod \"service-ca-865cb79987-v9mh6\" (UID: \"a7450313-0679-4b21-85ed-95035e064ae4\") " pod="openshift-service-ca/service-ca-865cb79987-v9mh6" Apr 17 16:53:29.758564 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:29.758559 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/a7450313-0679-4b21-85ed-95035e064ae4-signing-key\") pod \"service-ca-865cb79987-v9mh6\" (UID: \"a7450313-0679-4b21-85ed-95035e064ae4\") " pod="openshift-service-ca/service-ca-865cb79987-v9mh6" Apr 17 16:53:29.758718 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:29.758563 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mr5v5\" (UniqueName: \"kubernetes.io/projected/a7450313-0679-4b21-85ed-95035e064ae4-kube-api-access-mr5v5\") pod \"service-ca-865cb79987-v9mh6\" (UID: \"a7450313-0679-4b21-85ed-95035e064ae4\") " pod="openshift-service-ca/service-ca-865cb79987-v9mh6" Apr 17 16:53:29.864251 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:29.864175 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-v9mh6" Apr 17 16:53:29.979627 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:29.979594 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-v9mh6"] Apr 17 16:53:29.983296 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:53:29.983266 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7450313_0679_4b21_85ed_95035e064ae4.slice/crio-b49c7a9c79c528dde333a88ce0daa57b32d0efec8602f90e02e1a898a90f8eaf WatchSource:0}: Error finding container b49c7a9c79c528dde333a88ce0daa57b32d0efec8602f90e02e1a898a90f8eaf: Status 404 returned error can't find the container with id b49c7a9c79c528dde333a88ce0daa57b32d0efec8602f90e02e1a898a90f8eaf Apr 17 16:53:30.856401 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:30.856360 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-v9mh6" event={"ID":"a7450313-0679-4b21-85ed-95035e064ae4","Type":"ContainerStarted","Data":"b49c7a9c79c528dde333a88ce0daa57b32d0efec8602f90e02e1a898a90f8eaf"} Apr 17 16:53:31.860072 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:31.860033 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-v9mh6" event={"ID":"a7450313-0679-4b21-85ed-95035e064ae4","Type":"ContainerStarted","Data":"b8f0519809d7f0992b2c94d051a7e958b31350063a2b6093c006669140190512"} Apr 17 16:53:31.881356 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:31.881312 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-v9mh6" podStartSLOduration=1.18200121 podStartE2EDuration="2.881299345s" podCreationTimestamp="2026-04-17 16:53:29 +0000 UTC" firstStartedPulling="2026-04-17 16:53:29.98479629 +0000 UTC m=+54.891681085" lastFinishedPulling="2026-04-17 16:53:31.68409441 
+0000 UTC m=+56.590979220" observedRunningTime="2026-04-17 16:53:31.88027178 +0000 UTC m=+56.787156596" watchObservedRunningTime="2026-04-17 16:53:31.881299345 +0000 UTC m=+56.788184160" Apr 17 16:53:36.299984 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:36.299941 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/4ec4e7c7-c56a-4b03-8b87-f5e167104026-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2lg6q\" (UID: \"4ec4e7c7-c56a-4b03-8b87-f5e167104026\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2lg6q" Apr 17 16:53:36.300425 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:36.300018 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b8319ed2-b789-4cb0-969b-0ef6032e8f49-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-js8f5\" (UID: \"b8319ed2-b789-4cb0-969b-0ef6032e8f49\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-js8f5" Apr 17 16:53:36.300425 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:36.300140 2568 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 16:53:36.300425 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:36.300224 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b8319ed2-b789-4cb0-969b-0ef6032e8f49-networking-console-plugin-cert podName:b8319ed2-b789-4cb0-969b-0ef6032e8f49 nodeName:}" failed. No retries permitted until 2026-04-17 16:53:52.300203592 +0000 UTC m=+77.207088390 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/b8319ed2-b789-4cb0-969b-0ef6032e8f49-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-js8f5" (UID: "b8319ed2-b789-4cb0-969b-0ef6032e8f49") : secret "networking-console-plugin-cert" not found Apr 17 16:53:36.302224 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:36.302197 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/4ec4e7c7-c56a-4b03-8b87-f5e167104026-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2lg6q\" (UID: \"4ec4e7c7-c56a-4b03-8b87-f5e167104026\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2lg6q" Apr 17 16:53:36.397028 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:36.397001 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-qzbdr\"" Apr 17 16:53:36.401310 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:36.401289 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70-registry-tls\") pod \"image-registry-588fbfbcd9-ddtmz\" (UID: \"5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70\") " pod="openshift-image-registry/image-registry-588fbfbcd9-ddtmz" Apr 17 16:53:36.401421 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:36.401338 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/30e3b557-37e3-4fa9-9974-a7e12eff41fb-cluster-monitoring-operator-tls\") pod 
\"cluster-monitoring-operator-75587bd455-c7rtc\" (UID: \"30e3b557-37e3-4fa9-9974-a7e12eff41fb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-c7rtc" Apr 17 16:53:36.401481 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:36.401441 2568 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 16:53:36.401533 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:36.401509 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/30e3b557-37e3-4fa9-9974-a7e12eff41fb-cluster-monitoring-operator-tls podName:30e3b557-37e3-4fa9-9974-a7e12eff41fb nodeName:}" failed. No retries permitted until 2026-04-17 16:53:52.401489982 +0000 UTC m=+77.308374794 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/30e3b557-37e3-4fa9-9974-a7e12eff41fb-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-c7rtc" (UID: "30e3b557-37e3-4fa9-9974-a7e12eff41fb") : secret "cluster-monitoring-operator-tls" not found Apr 17 16:53:36.403477 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:36.403457 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70-registry-tls\") pod \"image-registry-588fbfbcd9-ddtmz\" (UID: \"5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70\") " pod="openshift-image-registry/image-registry-588fbfbcd9-ddtmz" Apr 17 16:53:36.404261 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:36.404240 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2lg6q" Apr 17 16:53:36.531753 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:36.531724 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2lg6q"] Apr 17 16:53:36.546130 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:36.546109 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-97wlb\"" Apr 17 16:53:36.553566 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:36.553514 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-588fbfbcd9-ddtmz" Apr 17 16:53:36.670915 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:36.670883 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-588fbfbcd9-ddtmz"] Apr 17 16:53:36.673677 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:53:36.673644 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d5d3a6d_2d6e_4bf8_bcc4_8c44ed244a70.slice/crio-fdd43cb070dff14c828be0f2e06c9d7002075497b9d80188703be49dc7b1215f WatchSource:0}: Error finding container fdd43cb070dff14c828be0f2e06c9d7002075497b9d80188703be49dc7b1215f: Status 404 returned error can't find the container with id fdd43cb070dff14c828be0f2e06c9d7002075497b9d80188703be49dc7b1215f Apr 17 16:53:36.871552 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:36.871464 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2lg6q" event={"ID":"4ec4e7c7-c56a-4b03-8b87-f5e167104026","Type":"ContainerStarted","Data":"8d2d0ea298c12cda392aca0afd2fe27af5ff563e2b802afcf3c3fbf75d32b77b"} Apr 17 16:53:36.872905 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:36.872872 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-588fbfbcd9-ddtmz" event={"ID":"5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70","Type":"ContainerStarted","Data":"005ca94f7ff507962c0af1d70423cc4c2d4cf0826cebdb1701c5cc7c04f887e0"} Apr 17 16:53:36.872905 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:36.872907 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-588fbfbcd9-ddtmz" event={"ID":"5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70","Type":"ContainerStarted","Data":"fdd43cb070dff14c828be0f2e06c9d7002075497b9d80188703be49dc7b1215f"} Apr 17 16:53:36.873084 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:36.873018 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-588fbfbcd9-ddtmz" Apr 17 16:53:36.898104 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:36.898046 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-588fbfbcd9-ddtmz" podStartSLOduration=16.898028173 podStartE2EDuration="16.898028173s" podCreationTimestamp="2026-04-17 16:53:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:53:36.896809627 +0000 UTC m=+61.803694467" watchObservedRunningTime="2026-04-17 16:53:36.898028173 +0000 UTC m=+61.804912991" Apr 17 16:53:38.879437 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:38.879406 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2lg6q" event={"ID":"4ec4e7c7-c56a-4b03-8b87-f5e167104026","Type":"ContainerStarted","Data":"21677f30ab444ee0e5ba542fbaaaac03a8b58cff391916b66dbbd97a08e39624"} Apr 17 16:53:39.884295 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:39.884259 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2lg6q" event={"ID":"4ec4e7c7-c56a-4b03-8b87-f5e167104026","Type":"ContainerStarted","Data":"8e63b9d9eebce77cf95c668047bdebd8be3dca6c9f340cdd9a6d15c21b0d7516"} Apr 17 16:53:39.904899 ip-10-0-134-88 kubenswrapper[2568]: I0417 
16:53:39.904855 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2lg6q" podStartSLOduration=17.791053056 podStartE2EDuration="19.904843113s" podCreationTimestamp="2026-04-17 16:53:20 +0000 UTC" firstStartedPulling="2026-04-17 16:53:36.580440864 +0000 UTC m=+61.487325658" lastFinishedPulling="2026-04-17 16:53:38.694230918 +0000 UTC m=+63.601115715" observedRunningTime="2026-04-17 16:53:39.90334985 +0000 UTC m=+64.810234665" watchObservedRunningTime="2026-04-17 16:53:39.904843113 +0000 UTC m=+64.811727928" Apr 17 16:53:39.932644 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:39.932616 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b9cacbec-64af-43d7-85d4-fde767a1cfa3-metrics-tls\") pod \"dns-default-bc8nx\" (UID: \"b9cacbec-64af-43d7-85d4-fde767a1cfa3\") " pod="openshift-dns/dns-default-bc8nx" Apr 17 16:53:39.932762 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:39.932673 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/707c1483-2d89-4157-80d6-4356800a454b-cert\") pod \"ingress-canary-k7gjh\" (UID: \"707c1483-2d89-4157-80d6-4356800a454b\") " pod="openshift-ingress-canary/ingress-canary-k7gjh" Apr 17 16:53:39.935073 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:39.935049 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b9cacbec-64af-43d7-85d4-fde767a1cfa3-metrics-tls\") pod \"dns-default-bc8nx\" (UID: \"b9cacbec-64af-43d7-85d4-fde767a1cfa3\") " pod="openshift-dns/dns-default-bc8nx" Apr 17 16:53:39.935136 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:39.935103 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/707c1483-2d89-4157-80d6-4356800a454b-cert\") pod \"ingress-canary-k7gjh\" (UID: \"707c1483-2d89-4157-80d6-4356800a454b\") " pod="openshift-ingress-canary/ingress-canary-k7gjh" Apr 17 16:53:40.099227 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:40.099197 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-8mbmc\"" Apr 17 16:53:40.105675 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:40.105652 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-bkp6b\"" Apr 17 16:53:40.106696 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:40.106678 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-bc8nx" Apr 17 16:53:40.113977 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:40.113958 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-k7gjh" Apr 17 16:53:40.239620 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:40.239498 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-bc8nx"] Apr 17 16:53:40.241572 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:53:40.241540 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9cacbec_64af_43d7_85d4_fde767a1cfa3.slice/crio-d168968194ef47023d42561a010d59aa61ce230fe54618df6065f780bb8c6d6c WatchSource:0}: Error finding container d168968194ef47023d42561a010d59aa61ce230fe54618df6065f780bb8c6d6c: Status 404 returned error can't find the container with id d168968194ef47023d42561a010d59aa61ce230fe54618df6065f780bb8c6d6c Apr 17 16:53:40.251892 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:40.251870 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-k7gjh"] Apr 17 16:53:40.254988 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:53:40.254967 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod707c1483_2d89_4157_80d6_4356800a454b.slice/crio-83370e5706e348a0d2c694eb99456d35aa7866436344d38c9bd26e3ee9639f29 WatchSource:0}: Error finding container 83370e5706e348a0d2c694eb99456d35aa7866436344d38c9bd26e3ee9639f29: Status 404 returned error can't find the container with id 83370e5706e348a0d2c694eb99456d35aa7866436344d38c9bd26e3ee9639f29 Apr 17 16:53:40.889661 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:40.889605 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-bc8nx" event={"ID":"b9cacbec-64af-43d7-85d4-fde767a1cfa3","Type":"ContainerStarted","Data":"d168968194ef47023d42561a010d59aa61ce230fe54618df6065f780bb8c6d6c"} Apr 17 16:53:40.891138 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:40.891110 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-k7gjh" event={"ID":"707c1483-2d89-4157-80d6-4356800a454b","Type":"ContainerStarted","Data":"83370e5706e348a0d2c694eb99456d35aa7866436344d38c9bd26e3ee9639f29"} Apr 17 16:53:41.444047 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:41.444008 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1e75e15a-4a21-46fc-8ab6-d31ca6ee91f3-metrics-certs\") pod \"network-metrics-daemon-bcjnr\" (UID: \"1e75e15a-4a21-46fc-8ab6-d31ca6ee91f3\") " pod="openshift-multus/network-metrics-daemon-bcjnr" Apr 17 16:53:41.447570 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:41.447543 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 16:53:41.458045 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:41.457990 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1e75e15a-4a21-46fc-8ab6-d31ca6ee91f3-metrics-certs\") pod \"network-metrics-daemon-bcjnr\" (UID: \"1e75e15a-4a21-46fc-8ab6-d31ca6ee91f3\") " pod="openshift-multus/network-metrics-daemon-bcjnr" Apr 17 16:53:41.473047 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:41.472967 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-bkwcz\"" Apr 17 16:53:41.480413 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:41.480389 2568 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bcjnr" Apr 17 16:53:41.545319 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:41.545245 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5lz7r\" (UniqueName: \"kubernetes.io/projected/33d21ed2-8e33-49bf-a161-a1a1a93a72d8-kube-api-access-5lz7r\") pod \"network-check-target-blsng\" (UID: \"33d21ed2-8e33-49bf-a161-a1a1a93a72d8\") " pod="openshift-network-diagnostics/network-check-target-blsng" Apr 17 16:53:41.548697 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:41.548670 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 16:53:41.559239 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:41.559218 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 16:53:41.569212 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:41.569191 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lz7r\" (UniqueName: \"kubernetes.io/projected/33d21ed2-8e33-49bf-a161-a1a1a93a72d8-kube-api-access-5lz7r\") pod \"network-check-target-blsng\" (UID: \"33d21ed2-8e33-49bf-a161-a1a1a93a72d8\") " pod="openshift-network-diagnostics/network-check-target-blsng" Apr 17 16:53:41.778337 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:41.778306 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-wm7r2\"" Apr 17 16:53:41.786009 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:41.785979 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-blsng" Apr 17 16:53:42.377714 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:42.377667 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-blsng"] Apr 17 16:53:42.383799 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:53:42.383771 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33d21ed2_8e33_49bf_a161_a1a1a93a72d8.slice/crio-8f66fd4bfe6411e9704ee1313595b6e1ba515efe0d4130a84ce6bb8a0845bd36 WatchSource:0}: Error finding container 8f66fd4bfe6411e9704ee1313595b6e1ba515efe0d4130a84ce6bb8a0845bd36: Status 404 returned error can't find the container with id 8f66fd4bfe6411e9704ee1313595b6e1ba515efe0d4130a84ce6bb8a0845bd36 Apr 17 16:53:42.389840 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:42.389799 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-bcjnr"] Apr 17 16:53:42.394013 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:53:42.393988 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e75e15a_4a21_46fc_8ab6_d31ca6ee91f3.slice/crio-3861052e08e4183df2346b870e857bd9765fe36ee8ee54d1c453f8538492fe7f WatchSource:0}: Error finding container 3861052e08e4183df2346b870e857bd9765fe36ee8ee54d1c453f8538492fe7f: Status 404 returned error can't find the container with id 3861052e08e4183df2346b870e857bd9765fe36ee8ee54d1c453f8538492fe7f Apr 17 16:53:42.901665 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:42.901553 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-k7gjh" 
event={"ID":"707c1483-2d89-4157-80d6-4356800a454b","Type":"ContainerStarted","Data":"f0ed16497afe1bbf6a5779ee9782d1b988e2d4d288a9e8191d466b0a23707b26"} Apr 17 16:53:42.902863 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:42.902831 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-blsng" event={"ID":"33d21ed2-8e33-49bf-a161-a1a1a93a72d8","Type":"ContainerStarted","Data":"8f66fd4bfe6411e9704ee1313595b6e1ba515efe0d4130a84ce6bb8a0845bd36"} Apr 17 16:53:42.904764 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:42.904738 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-bc8nx" event={"ID":"b9cacbec-64af-43d7-85d4-fde767a1cfa3","Type":"ContainerStarted","Data":"d3a5f33e6ce0e75bb387a9b22331f2505530a423c1fbf1cbbf3bdfa8349889a3"} Apr 17 16:53:42.904875 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:42.904771 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-bc8nx" event={"ID":"b9cacbec-64af-43d7-85d4-fde767a1cfa3","Type":"ContainerStarted","Data":"2474a82bd291cc876d06a4869052576182ba216a23dc1108fcdd7b1b00061b51"} Apr 17 16:53:42.904962 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:42.904949 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-bc8nx" Apr 17 16:53:42.906044 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:42.906023 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-bcjnr" event={"ID":"1e75e15a-4a21-46fc-8ab6-d31ca6ee91f3","Type":"ContainerStarted","Data":"3861052e08e4183df2346b870e857bd9765fe36ee8ee54d1c453f8538492fe7f"} Apr 17 16:53:42.919010 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:42.918913 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-k7gjh" podStartSLOduration=33.968099942 podStartE2EDuration="35.918898516s" podCreationTimestamp="2026-04-17 16:53:07 +0000 UTC" firstStartedPulling="2026-04-17 16:53:40.256823985 +0000 UTC m=+65.163708787" lastFinishedPulling="2026-04-17 16:53:42.207622553 +0000 UTC m=+67.114507361" observedRunningTime="2026-04-17 16:53:42.917543062 +0000 UTC m=+67.824427879" watchObservedRunningTime="2026-04-17 16:53:42.918898516 +0000 UTC m=+67.825783372" Apr 17 16:53:42.934118 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:42.934017 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-bc8nx" podStartSLOduration=33.975164912 podStartE2EDuration="35.934002702s" podCreationTimestamp="2026-04-17 16:53:07 +0000 UTC" firstStartedPulling="2026-04-17 16:53:40.243580723 +0000 UTC m=+65.150465523" lastFinishedPulling="2026-04-17 16:53:42.202418515 +0000 UTC m=+67.109303313" observedRunningTime="2026-04-17 16:53:42.932934115 +0000 UTC m=+67.839818924" watchObservedRunningTime="2026-04-17 16:53:42.934002702 +0000 UTC m=+67.840887520" Apr 17 16:53:43.912933 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:43.912880 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-bcjnr" event={"ID":"1e75e15a-4a21-46fc-8ab6-d31ca6ee91f3","Type":"ContainerStarted","Data":"ae1919cdfe15e1c533a1f44c3fcaf43d134c544612e3f6e885398b96374d87a9"} Apr 17 16:53:43.913401 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:43.912943 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-bcjnr" 
event={"ID":"1e75e15a-4a21-46fc-8ab6-d31ca6ee91f3","Type":"ContainerStarted","Data":"f43821eadadbee4b54ff2d366aefc88420d5972467bc5c35591b2f01d3086c46"} Apr 17 16:53:43.929798 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:43.929739 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-bcjnr" podStartSLOduration=67.881770674 podStartE2EDuration="1m8.929721126s" podCreationTimestamp="2026-04-17 16:52:35 +0000 UTC" firstStartedPulling="2026-04-17 16:53:42.397228289 +0000 UTC m=+67.304113089" lastFinishedPulling="2026-04-17 16:53:43.445178746 +0000 UTC m=+68.352063541" observedRunningTime="2026-04-17 16:53:43.928179116 +0000 UTC m=+68.835063932" watchObservedRunningTime="2026-04-17 16:53:43.929721126 +0000 UTC m=+68.836605944" Apr 17 16:53:45.919354 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:45.919315 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-blsng" event={"ID":"33d21ed2-8e33-49bf-a161-a1a1a93a72d8","Type":"ContainerStarted","Data":"b933e5e7b608678a2fb0f6e3234f06a2d8bde2d30d455fb03e1129ac746db9a6"} Apr 17 16:53:45.919726 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:45.919581 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-blsng" Apr 17 16:53:45.935030 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:45.934984 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-blsng" podStartSLOduration=68.207798903 podStartE2EDuration="1m10.934969521s" podCreationTimestamp="2026-04-17 16:52:35 +0000 UTC" firstStartedPulling="2026-04-17 16:53:42.388523129 +0000 UTC m=+67.295407930" lastFinishedPulling="2026-04-17 16:53:45.115693751 +0000 UTC m=+70.022578548" observedRunningTime="2026-04-17 16:53:45.934226844 +0000 UTC m=+70.841111659" watchObservedRunningTime="2026-04-17 16:53:45.934969521 +0000 UTC m=+70.841854338" Apr 17 16:53:51.014016 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:51.013987 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-588fbfbcd9-ddtmz"] Apr 17 16:53:51.018153 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:51.018121 2568 patch_prober.go:28] interesting pod/image-registry-588fbfbcd9-ddtmz container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 17 16:53:51.018298 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:51.018174 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-588fbfbcd9-ddtmz" podUID="5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 16:53:51.093963 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:51.093910 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-jh5wg"] Apr 17 16:53:51.097160 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:51.097140 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-jh5wg" Apr 17 16:53:51.099745 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:51.099726 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-whh7q\"" Apr 17 16:53:51.100234 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:51.100216 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 17 16:53:51.102650 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:51.102633 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 17 16:53:51.118656 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:51.118631 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-jh5wg"] Apr 17 16:53:51.222244 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:51.222212 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w76cx\" (UniqueName: \"kubernetes.io/projected/ea016832-b0d1-48d2-ac01-0d47d66fdf3e-kube-api-access-w76cx\") pod \"insights-runtime-extractor-jh5wg\" (UID: \"ea016832-b0d1-48d2-ac01-0d47d66fdf3e\") " pod="openshift-insights/insights-runtime-extractor-jh5wg" Apr 17 16:53:51.222420 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:51.222259 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/ea016832-b0d1-48d2-ac01-0d47d66fdf3e-data-volume\") pod \"insights-runtime-extractor-jh5wg\" (UID: \"ea016832-b0d1-48d2-ac01-0d47d66fdf3e\") " pod="openshift-insights/insights-runtime-extractor-jh5wg" Apr 17 16:53:51.222420 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:51.222364 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/ea016832-b0d1-48d2-ac01-0d47d66fdf3e-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-jh5wg\" (UID: \"ea016832-b0d1-48d2-ac01-0d47d66fdf3e\") " pod="openshift-insights/insights-runtime-extractor-jh5wg" Apr 17 16:53:51.222420 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:51.222394 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ea016832-b0d1-48d2-ac01-0d47d66fdf3e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-jh5wg\" (UID: \"ea016832-b0d1-48d2-ac01-0d47d66fdf3e\") " pod="openshift-insights/insights-runtime-extractor-jh5wg" Apr 17 16:53:51.222543 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:51.222428 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/ea016832-b0d1-48d2-ac01-0d47d66fdf3e-crio-socket\") pod \"insights-runtime-extractor-jh5wg\" (UID: \"ea016832-b0d1-48d2-ac01-0d47d66fdf3e\") " pod="openshift-insights/insights-runtime-extractor-jh5wg" Apr 17 16:53:51.323316 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:51.323229 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/ea016832-b0d1-48d2-ac01-0d47d66fdf3e-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-jh5wg\" (UID: 
\"ea016832-b0d1-48d2-ac01-0d47d66fdf3e\") " pod="openshift-insights/insights-runtime-extractor-jh5wg" Apr 17 16:53:51.323316 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:51.323268 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ea016832-b0d1-48d2-ac01-0d47d66fdf3e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-jh5wg\" (UID: \"ea016832-b0d1-48d2-ac01-0d47d66fdf3e\") " pod="openshift-insights/insights-runtime-extractor-jh5wg" Apr 17 16:53:51.323316 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:51.323314 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/ea016832-b0d1-48d2-ac01-0d47d66fdf3e-crio-socket\") pod \"insights-runtime-extractor-jh5wg\" (UID: \"ea016832-b0d1-48d2-ac01-0d47d66fdf3e\") " pod="openshift-insights/insights-runtime-extractor-jh5wg" Apr 17 16:53:51.323743 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:51.323366 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w76cx\" (UniqueName: \"kubernetes.io/projected/ea016832-b0d1-48d2-ac01-0d47d66fdf3e-kube-api-access-w76cx\") pod \"insights-runtime-extractor-jh5wg\" (UID: \"ea016832-b0d1-48d2-ac01-0d47d66fdf3e\") " pod="openshift-insights/insights-runtime-extractor-jh5wg" Apr 17 16:53:51.323743 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:51.323393 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/ea016832-b0d1-48d2-ac01-0d47d66fdf3e-data-volume\") pod \"insights-runtime-extractor-jh5wg\" (UID: \"ea016832-b0d1-48d2-ac01-0d47d66fdf3e\") " pod="openshift-insights/insights-runtime-extractor-jh5wg" Apr 17 16:53:51.323743 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:51.323428 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/ea016832-b0d1-48d2-ac01-0d47d66fdf3e-crio-socket\") pod \"insights-runtime-extractor-jh5wg\" (UID: \"ea016832-b0d1-48d2-ac01-0d47d66fdf3e\") " pod="openshift-insights/insights-runtime-extractor-jh5wg" Apr 17 16:53:51.323743 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:51.323714 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/ea016832-b0d1-48d2-ac01-0d47d66fdf3e-data-volume\") pod \"insights-runtime-extractor-jh5wg\" (UID: \"ea016832-b0d1-48d2-ac01-0d47d66fdf3e\") " pod="openshift-insights/insights-runtime-extractor-jh5wg" Apr 17 16:53:51.323867 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:51.323779 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/ea016832-b0d1-48d2-ac01-0d47d66fdf3e-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-jh5wg\" (UID: \"ea016832-b0d1-48d2-ac01-0d47d66fdf3e\") " pod="openshift-insights/insights-runtime-extractor-jh5wg" Apr 17 16:53:51.325692 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:51.325664 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ea016832-b0d1-48d2-ac01-0d47d66fdf3e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-jh5wg\" (UID: \"ea016832-b0d1-48d2-ac01-0d47d66fdf3e\") " pod="openshift-insights/insights-runtime-extractor-jh5wg" Apr 17 16:53:51.345302 
ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:51.345276 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w76cx\" (UniqueName: \"kubernetes.io/projected/ea016832-b0d1-48d2-ac01-0d47d66fdf3e-kube-api-access-w76cx\") pod \"insights-runtime-extractor-jh5wg\" (UID: \"ea016832-b0d1-48d2-ac01-0d47d66fdf3e\") " pod="openshift-insights/insights-runtime-extractor-jh5wg" Apr 17 16:53:51.406458 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:51.406432 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-jh5wg" Apr 17 16:53:51.520238 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:51.520211 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-jh5wg"] Apr 17 16:53:51.523509 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:53:51.523473 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea016832_b0d1_48d2_ac01_0d47d66fdf3e.slice/crio-ddc8b5928b3511bb3380323ec1d743bdb52bb9f109cb63c120263cd0cc174aed WatchSource:0}: Error finding container ddc8b5928b3511bb3380323ec1d743bdb52bb9f109cb63c120263cd0cc174aed: Status 404 returned error can't find the container with id ddc8b5928b3511bb3380323ec1d743bdb52bb9f109cb63c120263cd0cc174aed Apr 17 16:53:51.936139 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:51.936039 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-jh5wg" event={"ID":"ea016832-b0d1-48d2-ac01-0d47d66fdf3e","Type":"ContainerStarted","Data":"c9f571fef2c637dc8e19aa974b733b940388f98404b5726a185ba25780ea36c9"} Apr 17 16:53:51.936139 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:51.936081 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-jh5wg" event={"ID":"ea016832-b0d1-48d2-ac01-0d47d66fdf3e","Type":"ContainerStarted","Data":"ddc8b5928b3511bb3380323ec1d743bdb52bb9f109cb63c120263cd0cc174aed"} Apr 17 16:53:52.332042 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:52.332018 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b8319ed2-b789-4cb0-969b-0ef6032e8f49-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-js8f5\" (UID: \"b8319ed2-b789-4cb0-969b-0ef6032e8f49\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-js8f5" Apr 17 16:53:52.334167 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:52.334149 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b8319ed2-b789-4cb0-969b-0ef6032e8f49-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-js8f5\" (UID: \"b8319ed2-b789-4cb0-969b-0ef6032e8f49\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-js8f5" Apr 17 16:53:52.432368 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:52.432338 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/30e3b557-37e3-4fa9-9974-a7e12eff41fb-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-c7rtc\" (UID: \"30e3b557-37e3-4fa9-9974-a7e12eff41fb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-c7rtc" Apr 17 16:53:52.434617 ip-10-0-134-88 kubenswrapper[2568]: 
Apr 17 16:53:52.583759 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:52.583678 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-l8tbq\""
Apr 17 16:53:52.591770 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:52.591746 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-js8f5"
Apr 17 16:53:52.705683 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:52.705649 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-js8f5"]
Apr 17 16:53:52.707775 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:52.707752 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-zm97j\""
Apr 17 16:53:52.708425 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:53:52.708396 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8319ed2_b789_4cb0_969b_0ef6032e8f49.slice/crio-a35d195743baa00452a9d9fb0da7dc97f9ff313d26fb5b32615df2ef1737928e WatchSource:0}: Error finding container a35d195743baa00452a9d9fb0da7dc97f9ff313d26fb5b32615df2ef1737928e: Status 404 returned error can't find the container with id a35d195743baa00452a9d9fb0da7dc97f9ff313d26fb5b32615df2ef1737928e
Apr 17 16:53:52.715639 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:52.715619 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-c7rtc"
Apr 17 16:53:52.833183 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:52.833144 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-c7rtc"]
Apr 17 16:53:52.837333 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:53:52.837307 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30e3b557_37e3_4fa9_9974_a7e12eff41fb.slice/crio-77d6949679b5d18ca428f04253917d115513ccf079376089fc1a2f09a4f1a00b WatchSource:0}: Error finding container 77d6949679b5d18ca428f04253917d115513ccf079376089fc1a2f09a4f1a00b: Status 404 returned error can't find the container with id 77d6949679b5d18ca428f04253917d115513ccf079376089fc1a2f09a4f1a00b
Apr 17 16:53:52.915850 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:52.915817 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-bc8nx"
Apr 17 16:53:52.941064 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:52.941024 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-jh5wg" event={"ID":"ea016832-b0d1-48d2-ac01-0d47d66fdf3e","Type":"ContainerStarted","Data":"85f269dc879ee4ce16d91a83dd423667c8ea6710b8b488a76f63837082333fc7"}
Apr 17 16:53:52.942206 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:52.942177 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-c7rtc" event={"ID":"30e3b557-37e3-4fa9-9974-a7e12eff41fb","Type":"ContainerStarted","Data":"77d6949679b5d18ca428f04253917d115513ccf079376089fc1a2f09a4f1a00b"}
Apr 17 16:53:52.943466 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:52.943413 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-js8f5" event={"ID":"b8319ed2-b789-4cb0-969b-0ef6032e8f49","Type":"ContainerStarted","Data":"a35d195743baa00452a9d9fb0da7dc97f9ff313d26fb5b32615df2ef1737928e"}
Apr 17 16:53:54.950219 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:54.950139 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-jh5wg" event={"ID":"ea016832-b0d1-48d2-ac01-0d47d66fdf3e","Type":"ContainerStarted","Data":"657c8d36d9193620d41de56a685185839e394acf2bcc6efbfeab8f91c244d5bc"}
Apr 17 16:53:54.951605 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:54.951568 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-c7rtc" event={"ID":"30e3b557-37e3-4fa9-9974-a7e12eff41fb","Type":"ContainerStarted","Data":"e9894464e958d3b480e92aa56a4bb9608082dda0b1046c16d4064bdc8cddcf18"}
Apr 17 16:53:54.952900 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:54.952872 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-js8f5" event={"ID":"b8319ed2-b789-4cb0-969b-0ef6032e8f49","Type":"ContainerStarted","Data":"f08e956bc7988bfb07d3e7137c46415a2b7f2f74c237596c0d6bbbd9fc1695bd"}
Apr 17 16:53:54.971430 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:54.971393 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-jh5wg" podStartSLOduration=1.026701422 podStartE2EDuration="3.971382716s" podCreationTimestamp="2026-04-17 16:53:51 +0000 UTC" firstStartedPulling="2026-04-17 16:53:51.5794196 +0000 UTC m=+76.486304395" lastFinishedPulling="2026-04-17 16:53:54.524100893 +0000 UTC m=+79.430985689" observedRunningTime="2026-04-17 16:53:54.969657228 +0000 UTC m=+79.876542044" watchObservedRunningTime="2026-04-17 16:53:54.971382716 +0000 UTC m=+79.878267510"
Apr 17 16:53:54.986198 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:54.986147 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-js8f5" podStartSLOduration=33.17290005 podStartE2EDuration="34.986130935s" podCreationTimestamp="2026-04-17 16:53:20 +0000 UTC" firstStartedPulling="2026-04-17 16:53:52.710275639 +0000 UTC m=+77.617160436" lastFinishedPulling="2026-04-17 16:53:54.523506522 +0000 UTC m=+79.430391321" observedRunningTime="2026-04-17 16:53:54.985151428 +0000 UTC m=+79.892036244" watchObservedRunningTime="2026-04-17 16:53:54.986130935 +0000 UTC m=+79.893015752"
Apr 17 16:53:55.001646 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:55.001605 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-c7rtc" podStartSLOduration=33.312832965 podStartE2EDuration="35.001593152s" podCreationTimestamp="2026-04-17 16:53:20 +0000 UTC" firstStartedPulling="2026-04-17 16:53:52.83996184 +0000 UTC m=+77.746846640" lastFinishedPulling="2026-04-17 16:53:54.528722031 +0000 UTC m=+79.435606827" observedRunningTime="2026-04-17 16:53:55.000446252 +0000 UTC m=+79.907331106" watchObservedRunningTime="2026-04-17 16:53:55.001593152 +0000 UTC m=+79.908477968"
Apr 17 16:53:55.036783 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:55.036754 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-q59r6"]
Apr 17 16:53:55.040119 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:55.040103 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-q59r6"
Apr 17 16:53:55.042652 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:55.042633 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Apr 17 16:53:55.042747 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:55.042718 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-2gdpv\""
Apr 17 16:53:55.048256 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:55.048236 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-q59r6"]
Apr 17 16:53:55.153688 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:55.153653 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/2ccace76-e6f7-4577-8d52-ce5ff0fb350e-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-q59r6\" (UID: \"2ccace76-e6f7-4577-8d52-ce5ff0fb350e\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-q59r6"
Apr 17 16:53:55.254996 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:55.254963 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/2ccace76-e6f7-4577-8d52-ce5ff0fb350e-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-q59r6\" (UID: \"2ccace76-e6f7-4577-8d52-ce5ff0fb350e\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-q59r6"
Apr 17 16:53:55.255167 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:55.255081 2568 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found
Apr 17 16:53:55.255167 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:55.255136 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ccace76-e6f7-4577-8d52-ce5ff0fb350e-tls-certificates podName:2ccace76-e6f7-4577-8d52-ce5ff0fb350e nodeName:}" failed. No retries permitted until 2026-04-17 16:53:55.755119957 +0000 UTC m=+80.662004750 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/2ccace76-e6f7-4577-8d52-ce5ff0fb350e-tls-certificates") pod "prometheus-operator-admission-webhook-57cf98b594-q59r6" (UID: "2ccace76-e6f7-4577-8d52-ce5ff0fb350e") : secret "prometheus-operator-admission-webhook-tls" not found
Apr 17 16:53:55.759178 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:55.759143 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/2ccace76-e6f7-4577-8d52-ce5ff0fb350e-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-q59r6\" (UID: \"2ccace76-e6f7-4577-8d52-ce5ff0fb350e\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-q59r6"
Apr 17 16:53:55.761490 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:55.761457 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/2ccace76-e6f7-4577-8d52-ce5ff0fb350e-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-q59r6\" (UID: \"2ccace76-e6f7-4577-8d52-ce5ff0fb350e\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-q59r6"
Apr 17 16:53:55.949932 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:55.949899 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-q59r6"
Apr 17 16:53:56.063859 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:56.063831 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-q59r6"]
Apr 17 16:53:56.066959 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:53:56.066933 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ccace76_e6f7_4577_8d52_ce5ff0fb350e.slice/crio-c1a3314568e0fb6827ea6559ba3290eb176554abf874cfcbe59f70ed5a481924 WatchSource:0}: Error finding container c1a3314568e0fb6827ea6559ba3290eb176554abf874cfcbe59f70ed5a481924: Status 404 returned error can't find the container with id c1a3314568e0fb6827ea6559ba3290eb176554abf874cfcbe59f70ed5a481924
Apr 17 16:53:56.628203 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:56.628173 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7d7978c455-g5j8f"]
Apr 17 16:53:56.631196 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:56.631173 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7d7978c455-g5j8f"
Apr 17 16:53:56.634400 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:56.634377 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 17 16:53:56.635510 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:56.635491 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-h7h8x\""
Apr 17 16:53:56.635613 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:56.635527 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 17 16:53:56.635674 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:56.635614 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 17 16:53:56.635674 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:56.635622 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 17 16:53:56.635674 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:56.635495 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 17 16:53:56.635826 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:56.635734 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 17 16:53:56.635959 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:56.635942 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 17 16:53:56.644602 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:56.644573 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7d7978c455-g5j8f"]
Apr 17 16:53:56.766291 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:56.766246 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9dc4fa69-c69e-48f9-8c29-6b43c3163ddf-console-config\") pod \"console-7d7978c455-g5j8f\" (UID: \"9dc4fa69-c69e-48f9-8c29-6b43c3163ddf\") " pod="openshift-console/console-7d7978c455-g5j8f"
Apr 17 16:53:56.766291 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:56.766301 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9dc4fa69-c69e-48f9-8c29-6b43c3163ddf-oauth-serving-cert\") pod \"console-7d7978c455-g5j8f\" (UID: \"9dc4fa69-c69e-48f9-8c29-6b43c3163ddf\") " pod="openshift-console/console-7d7978c455-g5j8f"
Apr 17 16:53:56.766497 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:56.766328 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9dc4fa69-c69e-48f9-8c29-6b43c3163ddf-console-oauth-config\") pod \"console-7d7978c455-g5j8f\" (UID: \"9dc4fa69-c69e-48f9-8c29-6b43c3163ddf\") " pod="openshift-console/console-7d7978c455-g5j8f"
Apr 17 16:53:56.766497 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:56.766344 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhlmc\" (UniqueName: \"kubernetes.io/projected/9dc4fa69-c69e-48f9-8c29-6b43c3163ddf-kube-api-access-fhlmc\") pod \"console-7d7978c455-g5j8f\" (UID: \"9dc4fa69-c69e-48f9-8c29-6b43c3163ddf\") " pod="openshift-console/console-7d7978c455-g5j8f"
Apr 17 16:53:56.766497 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:56.766369 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9dc4fa69-c69e-48f9-8c29-6b43c3163ddf-service-ca\") pod \"console-7d7978c455-g5j8f\" (UID: \"9dc4fa69-c69e-48f9-8c29-6b43c3163ddf\") " pod="openshift-console/console-7d7978c455-g5j8f"
Apr 17 16:53:56.766497 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:56.766442 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9dc4fa69-c69e-48f9-8c29-6b43c3163ddf-console-serving-cert\") pod \"console-7d7978c455-g5j8f\" (UID: \"9dc4fa69-c69e-48f9-8c29-6b43c3163ddf\") " pod="openshift-console/console-7d7978c455-g5j8f"
Apr 17 16:53:56.867630 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:56.867574 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9dc4fa69-c69e-48f9-8c29-6b43c3163ddf-console-config\") pod \"console-7d7978c455-g5j8f\" (UID: \"9dc4fa69-c69e-48f9-8c29-6b43c3163ddf\") " pod="openshift-console/console-7d7978c455-g5j8f"
Apr 17 16:53:56.867630 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:56.867635 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9dc4fa69-c69e-48f9-8c29-6b43c3163ddf-oauth-serving-cert\") pod \"console-7d7978c455-g5j8f\" (UID: \"9dc4fa69-c69e-48f9-8c29-6b43c3163ddf\") " pod="openshift-console/console-7d7978c455-g5j8f"
Apr 17 16:53:56.867847 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:56.867734 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9dc4fa69-c69e-48f9-8c29-6b43c3163ddf-console-oauth-config\") pod \"console-7d7978c455-g5j8f\" (UID: \"9dc4fa69-c69e-48f9-8c29-6b43c3163ddf\") " pod="openshift-console/console-7d7978c455-g5j8f"
Apr 17 16:53:56.867847 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:56.867754 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fhlmc\" (UniqueName: \"kubernetes.io/projected/9dc4fa69-c69e-48f9-8c29-6b43c3163ddf-kube-api-access-fhlmc\") pod \"console-7d7978c455-g5j8f\" (UID: \"9dc4fa69-c69e-48f9-8c29-6b43c3163ddf\") " pod="openshift-console/console-7d7978c455-g5j8f"
Apr 17 16:53:56.867847 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:56.867783 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9dc4fa69-c69e-48f9-8c29-6b43c3163ddf-service-ca\") pod \"console-7d7978c455-g5j8f\" (UID: \"9dc4fa69-c69e-48f9-8c29-6b43c3163ddf\") " pod="openshift-console/console-7d7978c455-g5j8f"
Apr 17 16:53:56.867847 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:56.867804 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9dc4fa69-c69e-48f9-8c29-6b43c3163ddf-console-serving-cert\") pod \"console-7d7978c455-g5j8f\" (UID: \"9dc4fa69-c69e-48f9-8c29-6b43c3163ddf\") " pod="openshift-console/console-7d7978c455-g5j8f"
Apr 17 16:53:56.868366 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:56.868341 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9dc4fa69-c69e-48f9-8c29-6b43c3163ddf-console-config\") pod \"console-7d7978c455-g5j8f\" (UID: \"9dc4fa69-c69e-48f9-8c29-6b43c3163ddf\") " pod="openshift-console/console-7d7978c455-g5j8f"
Apr 17 16:53:56.868461 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:56.868338 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9dc4fa69-c69e-48f9-8c29-6b43c3163ddf-oauth-serving-cert\") pod \"console-7d7978c455-g5j8f\" (UID: \"9dc4fa69-c69e-48f9-8c29-6b43c3163ddf\") " pod="openshift-console/console-7d7978c455-g5j8f"
Apr 17 16:53:56.868515 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:56.868492 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9dc4fa69-c69e-48f9-8c29-6b43c3163ddf-service-ca\") pod \"console-7d7978c455-g5j8f\" (UID: \"9dc4fa69-c69e-48f9-8c29-6b43c3163ddf\") " pod="openshift-console/console-7d7978c455-g5j8f"
Apr 17 16:53:56.870130 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:56.870109 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9dc4fa69-c69e-48f9-8c29-6b43c3163ddf-console-oauth-config\") pod \"console-7d7978c455-g5j8f\" (UID: \"9dc4fa69-c69e-48f9-8c29-6b43c3163ddf\") " pod="openshift-console/console-7d7978c455-g5j8f"
Apr 17 16:53:56.876865 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:56.876811 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9dc4fa69-c69e-48f9-8c29-6b43c3163ddf-console-serving-cert\") pod \"console-7d7978c455-g5j8f\" (UID: \"9dc4fa69-c69e-48f9-8c29-6b43c3163ddf\") " pod="openshift-console/console-7d7978c455-g5j8f"
Apr 17 16:53:56.879156 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:56.879076 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhlmc\" (UniqueName: \"kubernetes.io/projected/9dc4fa69-c69e-48f9-8c29-6b43c3163ddf-kube-api-access-fhlmc\") pod \"console-7d7978c455-g5j8f\" (UID: \"9dc4fa69-c69e-48f9-8c29-6b43c3163ddf\") " pod="openshift-console/console-7d7978c455-g5j8f"
Apr 17 16:53:56.943205 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:56.943171 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7d7978c455-g5j8f"
Apr 17 16:53:56.960493 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:56.960461 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-q59r6" event={"ID":"2ccace76-e6f7-4577-8d52-ce5ff0fb350e","Type":"ContainerStarted","Data":"c1a3314568e0fb6827ea6559ba3290eb176554abf874cfcbe59f70ed5a481924"}
Apr 17 16:53:57.237138 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:57.236777 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7d7978c455-g5j8f"]
Apr 17 16:53:57.241066 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:53:57.241035 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9dc4fa69_c69e_48f9_8c29_6b43c3163ddf.slice/crio-f3ee457743a21d50a987199676207013ad9afcc3e99f1e736d9d50bb0ba2bd59 WatchSource:0}: Error finding container f3ee457743a21d50a987199676207013ad9afcc3e99f1e736d9d50bb0ba2bd59: Status 404 returned error can't find the container with id f3ee457743a21d50a987199676207013ad9afcc3e99f1e736d9d50bb0ba2bd59
Apr 17 16:53:57.966383 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:57.966342 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-q59r6" event={"ID":"2ccace76-e6f7-4577-8d52-ce5ff0fb350e","Type":"ContainerStarted","Data":"bd4be2b5098e3b2d76c7b4c1d15d10aba9969903ed8e1603e03e045ea758dd43"}
Apr 17 16:53:57.966622 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:57.966565 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-q59r6"
Apr 17 16:53:57.967727 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:57.967696 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7d7978c455-g5j8f" event={"ID":"9dc4fa69-c69e-48f9-8c29-6b43c3163ddf","Type":"ContainerStarted","Data":"f3ee457743a21d50a987199676207013ad9afcc3e99f1e736d9d50bb0ba2bd59"}
Apr 17 16:53:57.972713 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:57.972670 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-q59r6"
Apr 17 16:53:57.981136 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:57.981086 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-q59r6" podStartSLOduration=1.889619319 podStartE2EDuration="2.981073947s" podCreationTimestamp="2026-04-17 16:53:55 +0000 UTC" firstStartedPulling="2026-04-17 16:53:56.068914606 +0000 UTC m=+80.975799400" lastFinishedPulling="2026-04-17 16:53:57.1603692 +0000 UTC m=+82.067254028" observedRunningTime="2026-04-17 16:53:57.980820244 +0000 UTC m=+82.887705061" watchObservedRunningTime="2026-04-17 16:53:57.981073947 +0000 UTC m=+82.887958763"
Apr 17 16:53:58.091542 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:58.091504 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-f8m8f"]
Apr 17 16:53:58.094852 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:58.094825 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-f8m8f"
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-f8m8f"
Apr 17 16:53:58.097637 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:58.097599 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 17 16:53:58.097796 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:58.097663 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 17 16:53:58.097796 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:58.097606 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-nhg8q\""
Apr 17 16:53:58.097796 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:58.097609 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 17 16:53:58.104537 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:58.104514 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-f8m8f"]
Apr 17 16:53:58.179153 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:58.179122 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/14a9d6ea-9985-48e3-969f-8f976c314970-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-f8m8f\" (UID: \"14a9d6ea-9985-48e3-969f-8f976c314970\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-f8m8f"
Apr 17 16:53:58.179326 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:58.179232 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xmc5\" (UniqueName: \"kubernetes.io/projected/14a9d6ea-9985-48e3-969f-8f976c314970-kube-api-access-7xmc5\") pod \"prometheus-operator-5676c8c784-f8m8f\" (UID: \"14a9d6ea-9985-48e3-969f-8f976c314970\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-f8m8f"
Apr 17 16:53:58.179372 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:58.179336 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/14a9d6ea-9985-48e3-969f-8f976c314970-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-f8m8f\" (UID: \"14a9d6ea-9985-48e3-969f-8f976c314970\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-f8m8f"
Apr 17 16:53:58.179413 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:58.179388 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/14a9d6ea-9985-48e3-969f-8f976c314970-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-f8m8f\" (UID: \"14a9d6ea-9985-48e3-969f-8f976c314970\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-f8m8f"
Apr 17 16:53:58.280466 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:58.280428 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/14a9d6ea-9985-48e3-969f-8f976c314970-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-f8m8f\" (UID: \"14a9d6ea-9985-48e3-969f-8f976c314970\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-f8m8f"
Apr 17 16:53:58.280988 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:58.280482 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/14a9d6ea-9985-48e3-969f-8f976c314970-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-f8m8f\" (UID: \"14a9d6ea-9985-48e3-969f-8f976c314970\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-f8m8f"
Apr 17 16:53:58.280988 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:58.280517 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/14a9d6ea-9985-48e3-969f-8f976c314970-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-f8m8f\" (UID: \"14a9d6ea-9985-48e3-969f-8f976c314970\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-f8m8f"
Apr 17 16:53:58.280988 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:58.280604 2568 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found
Apr 17 16:53:58.280988 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:58.280634 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7xmc5\" (UniqueName: \"kubernetes.io/projected/14a9d6ea-9985-48e3-969f-8f976c314970-kube-api-access-7xmc5\") pod \"prometheus-operator-5676c8c784-f8m8f\" (UID: \"14a9d6ea-9985-48e3-969f-8f976c314970\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-f8m8f"
Apr 17 16:53:58.280988 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:53:58.280685 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14a9d6ea-9985-48e3-969f-8f976c314970-prometheus-operator-tls podName:14a9d6ea-9985-48e3-969f-8f976c314970 nodeName:}" failed. No retries permitted until 2026-04-17 16:53:58.78066117 +0000 UTC m=+83.687545977 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/14a9d6ea-9985-48e3-969f-8f976c314970-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-f8m8f" (UID: "14a9d6ea-9985-48e3-969f-8f976c314970") : secret "prometheus-operator-tls" not found
Apr 17 16:53:58.281404 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:58.281361 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/14a9d6ea-9985-48e3-969f-8f976c314970-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-f8m8f\" (UID: \"14a9d6ea-9985-48e3-969f-8f976c314970\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-f8m8f"
Apr 17 16:53:58.283415 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:58.283388 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/14a9d6ea-9985-48e3-969f-8f976c314970-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-f8m8f\" (UID: \"14a9d6ea-9985-48e3-969f-8f976c314970\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-f8m8f"
Apr 17 16:53:58.292633 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:58.292590 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xmc5\" (UniqueName: \"kubernetes.io/projected/14a9d6ea-9985-48e3-969f-8f976c314970-kube-api-access-7xmc5\") pod \"prometheus-operator-5676c8c784-f8m8f\" (UID: \"14a9d6ea-9985-48e3-969f-8f976c314970\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-f8m8f"
Apr 17 16:53:58.785389 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:58.785353 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/14a9d6ea-9985-48e3-969f-8f976c314970-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-f8m8f\" (UID: \"14a9d6ea-9985-48e3-969f-8f976c314970\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-f8m8f"
Apr 17 16:53:58.788118 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:58.788089 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/14a9d6ea-9985-48e3-969f-8f976c314970-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-f8m8f\" (UID: \"14a9d6ea-9985-48e3-969f-8f976c314970\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-f8m8f"
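[Editor's note] The failed prometheus-operator-tls mount above is the normal race on a fresh node: the pod spec references a Secret that has not been created yet, and the kubelet's volume manager parks the operation and schedules a retry instead of failing the pod. The first retry is scheduled 500ms out (durationBeforeRetry 500ms) and succeeds at 16:53:58.788 because the Secret appeared in the meantime. A minimal sketch of that retry policy follows; the 500ms initial delay is taken from the log line, while the doubling factor and the 2m2s ceiling are assumptions for illustration rather than the kubelet's exact constants.

package main

import (
	"fmt"
	"time"
)

// backoff sketches the per-operation retry delay visible in the log:
// 500ms before the first retry, then doubling on each consecutive
// failure up to a ceiling. (500ms is from the log; the factor and
// ceiling are assumptions for illustration.)
type backoff struct{ delay time.Duration }

func (b *backoff) next() time.Duration {
	const (
		initial  = 500 * time.Millisecond
		maxDelay = 2*time.Minute + 2*time.Second // assumed ceiling
	)
	switch {
	case b.delay == 0:
		b.delay = initial
	case b.delay*2 > maxDelay:
		b.delay = maxDelay
	default:
		b.delay *= 2
	}
	return b.delay
}

func main() {
	var b backoff
	for i := 0; i < 5; i++ {
		fmt.Println(b.next()) // 500ms 1s 2s 4s 8s
	}
}

Here the very first retry succeeds, so the failure never escalates beyond the two error lines above.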
Apr 17 16:53:59.007502 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:59.007470 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-f8m8f"
Apr 17 16:53:59.941579 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:59.941558 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-f8m8f"]
Apr 17 16:53:59.944056 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:53:59.944034 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14a9d6ea_9985_48e3_969f_8f976c314970.slice/crio-2f3f121fded55be29ca47d0c59ff815022d584dc29f28534c905afe0236276d8 WatchSource:0}: Error finding container 2f3f121fded55be29ca47d0c59ff815022d584dc29f28534c905afe0236276d8: Status 404 returned error can't find the container with id 2f3f121fded55be29ca47d0c59ff815022d584dc29f28534c905afe0236276d8
Apr 17 16:53:59.975871 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:59.975837 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7d7978c455-g5j8f" event={"ID":"9dc4fa69-c69e-48f9-8c29-6b43c3163ddf","Type":"ContainerStarted","Data":"8be2b9f02df019e3352f41d29675c15efbcea28fedf41250b673d9148fba4621"}
Apr 17 16:53:59.976975 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:59.976949 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-f8m8f" event={"ID":"14a9d6ea-9985-48e3-969f-8f976c314970","Type":"ContainerStarted","Data":"2f3f121fded55be29ca47d0c59ff815022d584dc29f28534c905afe0236276d8"}
Apr 17 16:53:59.993835 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:53:59.993792 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7d7978c455-g5j8f" podStartSLOduration=1.353533808 podStartE2EDuration="3.993777779s" podCreationTimestamp="2026-04-17 16:53:56 +0000 UTC" firstStartedPulling="2026-04-17 16:53:57.24318371 +0000 UTC m=+82.150068504" lastFinishedPulling="2026-04-17 16:53:59.883427681 +0000 UTC m=+84.790312475" observedRunningTime="2026-04-17 16:53:59.992333551 +0000 UTC m=+84.899218368" watchObservedRunningTime="2026-04-17 16:53:59.993777779 +0000 UTC m=+84.900662594"
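[Editor's note] The "Observed pod startup duration" line packs in a small derivation worth spelling out: as printed here, podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration is that same interval minus the image-pull window (lastFinishedPulling minus firstStartedPulling). Re-running the console pod's numbers, with the timestamps copied verbatim from the entry above (a sketch to verify the arithmetic, not kubelet code):

package main

import (
	"fmt"
	"time"
)

// Layout matching how the timestamps are printed in the log entry.
const layout = "2006-01-02 15:04:05.999999999 -0700 MST"

func parse(s string) time.Time {
	t, err := time.Parse(layout, s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	created := parse("2026-04-17 16:53:56 +0000 UTC")
	firstPull := parse("2026-04-17 16:53:57.24318371 +0000 UTC")
	lastPull := parse("2026-04-17 16:53:59.883427681 +0000 UTC")
	running := parse("2026-04-17 16:53:59.993777779 +0000 UTC")

	e2e := running.Sub(created)          // 3.993777779s == podStartE2EDuration
	slo := e2e - lastPull.Sub(firstPull) // 1.353533808s == podStartSLOduration
	fmt.Println(e2e, slo)
}

In other words, the console pod spent about 2.64s of its 3.99s startup pulling images, and the SLO figure deliberately excludes that pull time.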
Apr 17 16:54:01.018956 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:01.018911 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-588fbfbcd9-ddtmz"
Apr 17 16:54:01.984246 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:01.984207 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-f8m8f" event={"ID":"14a9d6ea-9985-48e3-969f-8f976c314970","Type":"ContainerStarted","Data":"2be7f53881377987d8741530de2ee91d951f0584330715503f3678e0e46e20e5"}
Apr 17 16:54:01.984246 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:01.984244 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-f8m8f" event={"ID":"14a9d6ea-9985-48e3-969f-8f976c314970","Type":"ContainerStarted","Data":"2119ab8e565a4fddf7390238f6f62a82642b17244092fa76224edc9eb423cce5"}
Apr 17 16:54:01.999819 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:01.999762 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-f8m8f" podStartSLOduration=2.839862794 podStartE2EDuration="3.999747111s" podCreationTimestamp="2026-04-17 16:53:58 +0000 UTC" firstStartedPulling="2026-04-17 16:53:59.946435349 +0000 UTC m=+84.853320155" lastFinishedPulling="2026-04-17 16:54:01.106319678 +0000 UTC m=+86.013204472" observedRunningTime="2026-04-17 16:54:01.999579724 +0000 UTC m=+86.906464540" watchObservedRunningTime="2026-04-17 16:54:01.999747111 +0000 UTC m=+86.906631927"
Apr 17 16:54:03.435678 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:03.435591 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-55zhx"]
Apr 17 16:54:03.439808 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:03.439778 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-64ttp"]
Apr 17 16:54:03.440000 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:03.439976 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-55zhx"
Apr 17 16:54:03.442517 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:03.442497 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\""
Apr 17 16:54:03.442668 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:03.442498 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\""
Apr 17 16:54:03.442743 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:03.442556 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-scn25\""
Apr 17 16:54:03.443985 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:03.443968 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-64ttp"
Apr 17 16:54:03.446878 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:03.446591 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\""
Apr 17 16:54:03.446878 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:03.446599 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\""
Apr 17 16:54:03.446878 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:03.446746 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-pbnl9\""
Apr 17 16:54:03.446878 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:03.446775 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\""
Apr 17 16:54:03.458827 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:03.455997 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-55zhx"]
Apr 17 16:54:03.458962 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:03.458824 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-64ttp"]
Apr 17 16:54:03.461608 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:03.460202 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-zpgj6"]
Apr 17 16:54:03.469559 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:03.465128 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-zpgj6"
Apr 17 16:54:03.469559 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:03.468094 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 17 16:54:03.469559 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:03.468228 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 17 16:54:03.469559 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:03.468340 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-z9mwd\""
Apr 17 16:54:03.469559 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:03.468568 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 17 16:54:03.623386 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:03.623348 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e38f13c7-0aae-46bc-9fad-7df2e1de3aa2-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-55zhx\" (UID: \"e38f13c7-0aae-46bc-9fad-7df2e1de3aa2\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-55zhx"
Apr 17 16:54:03.623386 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:03.623387 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgvlq\" (UniqueName: \"kubernetes.io/projected/8cbfc4c3-65fb-4b43-a0f8-4b26ca71366c-kube-api-access-cgvlq\") pod \"node-exporter-zpgj6\" (UID: \"8cbfc4c3-65fb-4b43-a0f8-4b26ca71366c\") " pod="openshift-monitoring/node-exporter-zpgj6"
Apr 17 16:54:03.623639 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:03.623415 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjh4x\" (UniqueName: \"kubernetes.io/projected/69985995-b454-4287-8e16-ad76e4c4e3e3-kube-api-access-jjh4x\") pod \"kube-state-metrics-69db897b98-64ttp\" (UID: \"69985995-b454-4287-8e16-ad76e4c4e3e3\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-64ttp"
Apr 17 16:54:03.623639 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:03.623446 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8cbfc4c3-65fb-4b43-a0f8-4b26ca71366c-sys\") pod \"node-exporter-zpgj6\" (UID: \"8cbfc4c3-65fb-4b43-a0f8-4b26ca71366c\") " pod="openshift-monitoring/node-exporter-zpgj6"
Apr 17 16:54:03.623639 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:03.623466 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/8cbfc4c3-65fb-4b43-a0f8-4b26ca71366c-node-exporter-tls\") pod \"node-exporter-zpgj6\" (UID: \"8cbfc4c3-65fb-4b43-a0f8-4b26ca71366c\") " pod="openshift-monitoring/node-exporter-zpgj6"
Apr 17 16:54:03.623639 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:03.623484 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/e38f13c7-0aae-46bc-9fad-7df2e1de3aa2-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-55zhx\" (UID: \"e38f13c7-0aae-46bc-9fad-7df2e1de3aa2\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-55zhx"
Apr 17 16:54:03.623639 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:03.623514 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkkbv\" (UniqueName: \"kubernetes.io/projected/e38f13c7-0aae-46bc-9fad-7df2e1de3aa2-kube-api-access-gkkbv\") pod \"openshift-state-metrics-9d44df66c-55zhx\" (UID: \"e38f13c7-0aae-46bc-9fad-7df2e1de3aa2\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-55zhx"
Apr 17 16:54:03.623639 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:03.623569 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/69985995-b454-4287-8e16-ad76e4c4e3e3-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-64ttp\" (UID: \"69985995-b454-4287-8e16-ad76e4c4e3e3\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-64ttp"
Apr 17 16:54:03.623639 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:03.623602 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/8cbfc4c3-65fb-4b43-a0f8-4b26ca71366c-root\") pod \"node-exporter-zpgj6\" (UID: \"8cbfc4c3-65fb-4b43-a0f8-4b26ca71366c\") " pod="openshift-monitoring/node-exporter-zpgj6"
Apr 17 16:54:03.623639 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:03.623622 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/8cbfc4c3-65fb-4b43-a0f8-4b26ca71366c-node-exporter-accelerators-collector-config\") pod \"node-exporter-zpgj6\" (UID: \"8cbfc4c3-65fb-4b43-a0f8-4b26ca71366c\") " pod="openshift-monitoring/node-exporter-zpgj6"
Apr 17 16:54:03.624052 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:03.623651 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/8cbfc4c3-65fb-4b43-a0f8-4b26ca71366c-node-exporter-textfile\") pod \"node-exporter-zpgj6\" (UID: \"8cbfc4c3-65fb-4b43-a0f8-4b26ca71366c\") " pod="openshift-monitoring/node-exporter-zpgj6"
Apr 17 16:54:03.624052 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:03.623672 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/69985995-b454-4287-8e16-ad76e4c4e3e3-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-64ttp\" (UID: \"69985995-b454-4287-8e16-ad76e4c4e3e3\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-64ttp"
Apr 17 16:54:03.624052 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:03.623689 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8cbfc4c3-65fb-4b43-a0f8-4b26ca71366c-metrics-client-ca\") pod \"node-exporter-zpgj6\" (UID: \"8cbfc4c3-65fb-4b43-a0f8-4b26ca71366c\") " pod="openshift-monitoring/node-exporter-zpgj6"
Apr 17 16:54:03.624052 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:03.623720 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e38f13c7-0aae-46bc-9fad-7df2e1de3aa2-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-55zhx\" (UID: \"e38f13c7-0aae-46bc-9fad-7df2e1de3aa2\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-55zhx"
Apr 17 16:54:03.624052 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:03.623742 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8cbfc4c3-65fb-4b43-a0f8-4b26ca71366c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zpgj6\" (UID: \"8cbfc4c3-65fb-4b43-a0f8-4b26ca71366c\") " pod="openshift-monitoring/node-exporter-zpgj6"
Apr 17 16:54:03.624052 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:03.623790 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/69985995-b454-4287-8e16-ad76e4c4e3e3-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-64ttp\" (UID: \"69985995-b454-4287-8e16-ad76e4c4e3e3\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-64ttp"
Apr 17 16:54:03.624052 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:03.623815 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/69985995-b454-4287-8e16-ad76e4c4e3e3-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-64ttp\" (UID: \"69985995-b454-4287-8e16-ad76e4c4e3e3\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-64ttp"
Apr 17 16:54:03.624052 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:03.623832 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/69985995-b454-4287-8e16-ad76e4c4e3e3-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-64ttp\" (UID: \"69985995-b454-4287-8e16-ad76e4c4e3e3\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-64ttp"
Apr 17 16:54:03.624052 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:03.623853 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/8cbfc4c3-65fb-4b43-a0f8-4b26ca71366c-node-exporter-wtmp\") pod \"node-exporter-zpgj6\" (UID: \"8cbfc4c3-65fb-4b43-a0f8-4b26ca71366c\") " pod="openshift-monitoring/node-exporter-zpgj6"
Apr 17 16:54:03.724342 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:03.724314 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/8cbfc4c3-65fb-4b43-a0f8-4b26ca71366c-node-exporter-textfile\") pod \"node-exporter-zpgj6\" (UID: \"8cbfc4c3-65fb-4b43-a0f8-4b26ca71366c\") " pod="openshift-monitoring/node-exporter-zpgj6"
Apr 17 16:54:03.724511 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:03.724352 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/69985995-b454-4287-8e16-ad76e4c4e3e3-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-64ttp\" (UID: \"69985995-b454-4287-8e16-ad76e4c4e3e3\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-64ttp"
Apr 17 16:54:03.724511 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:03.724371 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8cbfc4c3-65fb-4b43-a0f8-4b26ca71366c-metrics-client-ca\") pod \"node-exporter-zpgj6\" (UID: \"8cbfc4c3-65fb-4b43-a0f8-4b26ca71366c\") " pod="openshift-monitoring/node-exporter-zpgj6"
Apr 17 16:54:03.724511 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:03.724392 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e38f13c7-0aae-46bc-9fad-7df2e1de3aa2-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-55zhx\" (UID: \"e38f13c7-0aae-46bc-9fad-7df2e1de3aa2\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-55zhx"
Apr 17 16:54:03.724511 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:03.724417 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8cbfc4c3-65fb-4b43-a0f8-4b26ca71366c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zpgj6\" (UID: \"8cbfc4c3-65fb-4b43-a0f8-4b26ca71366c\") " pod="openshift-monitoring/node-exporter-zpgj6"
Apr 17 16:54:03.724511 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:03.724490 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/69985995-b454-4287-8e16-ad76e4c4e3e3-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-64ttp\" (UID: \"69985995-b454-4287-8e16-ad76e4c4e3e3\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-64ttp"
Apr 17 16:54:03.724773 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:03.724543 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/69985995-b454-4287-8e16-ad76e4c4e3e3-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-64ttp\" (UID: \"69985995-b454-4287-8e16-ad76e4c4e3e3\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-64ttp"
Apr 17 16:54:03.724773 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:03.724580 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/69985995-b454-4287-8e16-ad76e4c4e3e3-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-64ttp\" (UID: \"69985995-b454-4287-8e16-ad76e4c4e3e3\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-64ttp"
Apr 17 16:54:03.724773 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:03.724624 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/8cbfc4c3-65fb-4b43-a0f8-4b26ca71366c-node-exporter-wtmp\") pod \"node-exporter-zpgj6\" (UID: \"8cbfc4c3-65fb-4b43-a0f8-4b26ca71366c\") " pod="openshift-monitoring/node-exporter-zpgj6"
Apr 17 16:54:03.724773 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:03.724658 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e38f13c7-0aae-46bc-9fad-7df2e1de3aa2-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-55zhx\" (UID: \"e38f13c7-0aae-46bc-9fad-7df2e1de3aa2\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-55zhx"
Apr 17 16:54:03.724773 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:03.724693 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/8cbfc4c3-65fb-4b43-a0f8-4b26ca71366c-node-exporter-textfile\") pod \"node-exporter-zpgj6\" (UID: \"8cbfc4c3-65fb-4b43-a0f8-4b26ca71366c\") " pod="openshift-monitoring/node-exporter-zpgj6"
Apr 17 16:54:03.724773 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:03.724705 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cgvlq\" (UniqueName: \"kubernetes.io/projected/8cbfc4c3-65fb-4b43-a0f8-4b26ca71366c-kube-api-access-cgvlq\") pod \"node-exporter-zpgj6\" (UID: \"8cbfc4c3-65fb-4b43-a0f8-4b26ca71366c\") " pod="openshift-monitoring/node-exporter-zpgj6"
Apr 17 16:54:03.724773 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:03.724763 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jjh4x\" (UniqueName: \"kubernetes.io/projected/69985995-b454-4287-8e16-ad76e4c4e3e3-kube-api-access-jjh4x\") pod \"kube-state-metrics-69db897b98-64ttp\" (UID: \"69985995-b454-4287-8e16-ad76e4c4e3e3\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-64ttp"
Apr 17 16:54:03.725110 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:03.724807 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8cbfc4c3-65fb-4b43-a0f8-4b26ca71366c-sys\") pod \"node-exporter-zpgj6\" (UID: \"8cbfc4c3-65fb-4b43-a0f8-4b26ca71366c\") " pod="openshift-monitoring/node-exporter-zpgj6"
Apr 17 16:54:03.725110 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:03.724835 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/8cbfc4c3-65fb-4b43-a0f8-4b26ca71366c-node-exporter-tls\") pod \"node-exporter-zpgj6\" (UID: \"8cbfc4c3-65fb-4b43-a0f8-4b26ca71366c\") " pod="openshift-monitoring/node-exporter-zpgj6"
Apr 17 16:54:03.725110 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:03.724866 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/e38f13c7-0aae-46bc-9fad-7df2e1de3aa2-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-55zhx\" (UID: \"e38f13c7-0aae-46bc-9fad-7df2e1de3aa2\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-55zhx"
Apr 17 16:54:03.725110 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:03.724910 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gkkbv\" (UniqueName: \"kubernetes.io/projected/e38f13c7-0aae-46bc-9fad-7df2e1de3aa2-kube-api-access-gkkbv\") pod \"openshift-state-metrics-9d44df66c-55zhx\" (UID: \"e38f13c7-0aae-46bc-9fad-7df2e1de3aa2\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-55zhx"
Apr 17 16:54:03.725110 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:03.724975 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/69985995-b454-4287-8e16-ad76e4c4e3e3-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-64ttp\" (UID: \"69985995-b454-4287-8e16-ad76e4c4e3e3\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-64ttp"
Apr 17 16:54:03.725110 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:03.725003 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/8cbfc4c3-65fb-4b43-a0f8-4b26ca71366c-root\") pod \"node-exporter-zpgj6\" (UID: \"8cbfc4c3-65fb-4b43-a0f8-4b26ca71366c\") " pod="openshift-monitoring/node-exporter-zpgj6"
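[Editor's note] The wall of reconciler_common lines above (and the SetUp lines that follow) is one pass of the kubelet volume manager's reconciler: it diffs the desired state of the world (every volume the three new pod specs mount) against the actual state, then drives each missing volume through VerifyControllerAttachedVolume, MountVolume, and finally the volume plugin's SetUp. A schematic of that loop, with illustrative stand-in types rather than the kubelet's real ones (which live in its volumemanager package):

package main

import "fmt"

// volume is an illustrative stand-in for an entry in desired state.
type volume struct{ name, pod string }

func reconcile(desired []volume, actual map[string]bool) {
	for _, v := range desired {
		key := v.pod + "/" + v.name
		if actual[key] {
			continue // actual state already matches desired state
		}
		// Order mirrors the log: verify the attach, start the mount,
		// then the plugin's SetUp reports success.
		fmt.Printf("VerifyControllerAttachedVolume started for volume %q pod %q\n", v.name, v.pod)
		fmt.Printf("MountVolume started for volume %q pod %q\n", v.name, v.pod)
		actual[key] = true
		fmt.Printf("MountVolume.SetUp succeeded for volume %q pod %q\n", v.name, v.pod)
	}
}

func main() {
	desired := []volume{
		{"sys", "node-exporter-zpgj6"},
		{"node-exporter-tls", "node-exporter-zpgj6"},
		{"kube-state-metrics-tls", "kube-state-metrics-69db897b98-64ttp"},
	}
	reconcile(desired, map[string]bool{})
}

In the real reconciler the SetUp step is asynchronous, which is why failures (like the missing Secret below) are parked per-operation and retried without blocking the other volumes.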
\"root\" (UniqueName: \"kubernetes.io/host-path/8cbfc4c3-65fb-4b43-a0f8-4b26ca71366c-root\") pod \"node-exporter-zpgj6\" (UID: \"8cbfc4c3-65fb-4b43-a0f8-4b26ca71366c\") " pod="openshift-monitoring/node-exporter-zpgj6" Apr 17 16:54:03.725110 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:03.725037 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/8cbfc4c3-65fb-4b43-a0f8-4b26ca71366c-node-exporter-accelerators-collector-config\") pod \"node-exporter-zpgj6\" (UID: \"8cbfc4c3-65fb-4b43-a0f8-4b26ca71366c\") " pod="openshift-monitoring/node-exporter-zpgj6" Apr 17 16:54:03.725449 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:03.725198 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/8cbfc4c3-65fb-4b43-a0f8-4b26ca71366c-node-exporter-wtmp\") pod \"node-exporter-zpgj6\" (UID: \"8cbfc4c3-65fb-4b43-a0f8-4b26ca71366c\") " pod="openshift-monitoring/node-exporter-zpgj6" Apr 17 16:54:03.725449 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:03.725303 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8cbfc4c3-65fb-4b43-a0f8-4b26ca71366c-metrics-client-ca\") pod \"node-exporter-zpgj6\" (UID: \"8cbfc4c3-65fb-4b43-a0f8-4b26ca71366c\") " pod="openshift-monitoring/node-exporter-zpgj6" Apr 17 16:54:03.725584 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:03.725560 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8cbfc4c3-65fb-4b43-a0f8-4b26ca71366c-sys\") pod \"node-exporter-zpgj6\" (UID: \"8cbfc4c3-65fb-4b43-a0f8-4b26ca71366c\") " pod="openshift-monitoring/node-exporter-zpgj6" Apr 17 16:54:03.725643 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:03.725588 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/8cbfc4c3-65fb-4b43-a0f8-4b26ca71366c-node-exporter-accelerators-collector-config\") pod \"node-exporter-zpgj6\" (UID: \"8cbfc4c3-65fb-4b43-a0f8-4b26ca71366c\") " pod="openshift-monitoring/node-exporter-zpgj6" Apr 17 16:54:03.725696 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:03.725654 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e38f13c7-0aae-46bc-9fad-7df2e1de3aa2-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-55zhx\" (UID: \"e38f13c7-0aae-46bc-9fad-7df2e1de3aa2\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-55zhx" Apr 17 16:54:03.725770 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:03.725752 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/8cbfc4c3-65fb-4b43-a0f8-4b26ca71366c-root\") pod \"node-exporter-zpgj6\" (UID: \"8cbfc4c3-65fb-4b43-a0f8-4b26ca71366c\") " pod="openshift-monitoring/node-exporter-zpgj6" Apr 17 16:54:03.725988 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:54:03.725851 2568 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Apr 17 16:54:03.725988 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:03.725901 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: 
\"kubernetes.io/empty-dir/69985995-b454-4287-8e16-ad76e4c4e3e3-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-64ttp\" (UID: \"69985995-b454-4287-8e16-ad76e4c4e3e3\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-64ttp" Apr 17 16:54:03.725988 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:54:03.725950 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e38f13c7-0aae-46bc-9fad-7df2e1de3aa2-openshift-state-metrics-tls podName:e38f13c7-0aae-46bc-9fad-7df2e1de3aa2 nodeName:}" failed. No retries permitted until 2026-04-17 16:54:04.22591368 +0000 UTC m=+89.132798488 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/e38f13c7-0aae-46bc-9fad-7df2e1de3aa2-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-55zhx" (UID: "e38f13c7-0aae-46bc-9fad-7df2e1de3aa2") : secret "openshift-state-metrics-tls" not found Apr 17 16:54:03.726272 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:03.726070 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/69985995-b454-4287-8e16-ad76e4c4e3e3-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-64ttp\" (UID: \"69985995-b454-4287-8e16-ad76e4c4e3e3\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-64ttp" Apr 17 16:54:03.726272 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:03.726212 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/69985995-b454-4287-8e16-ad76e4c4e3e3-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-64ttp\" (UID: \"69985995-b454-4287-8e16-ad76e4c4e3e3\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-64ttp" Apr 17 16:54:03.727498 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:03.727474 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/69985995-b454-4287-8e16-ad76e4c4e3e3-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-64ttp\" (UID: \"69985995-b454-4287-8e16-ad76e4c4e3e3\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-64ttp" Apr 17 16:54:03.728012 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:03.727989 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e38f13c7-0aae-46bc-9fad-7df2e1de3aa2-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-55zhx\" (UID: \"e38f13c7-0aae-46bc-9fad-7df2e1de3aa2\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-55zhx" Apr 17 16:54:03.728302 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:03.728276 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8cbfc4c3-65fb-4b43-a0f8-4b26ca71366c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zpgj6\" (UID: \"8cbfc4c3-65fb-4b43-a0f8-4b26ca71366c\") " pod="openshift-monitoring/node-exporter-zpgj6" Apr 17 16:54:03.728399 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:03.728286 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/69985995-b454-4287-8e16-ad76e4c4e3e3-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-64ttp\" (UID: \"69985995-b454-4287-8e16-ad76e4c4e3e3\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-64ttp" Apr 17 16:54:03.728399 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:03.728380 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/8cbfc4c3-65fb-4b43-a0f8-4b26ca71366c-node-exporter-tls\") pod \"node-exporter-zpgj6\" (UID: \"8cbfc4c3-65fb-4b43-a0f8-4b26ca71366c\") " pod="openshift-monitoring/node-exporter-zpgj6" Apr 17 16:54:03.732731 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:03.732710 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgvlq\" (UniqueName: \"kubernetes.io/projected/8cbfc4c3-65fb-4b43-a0f8-4b26ca71366c-kube-api-access-cgvlq\") pod \"node-exporter-zpgj6\" (UID: \"8cbfc4c3-65fb-4b43-a0f8-4b26ca71366c\") " pod="openshift-monitoring/node-exporter-zpgj6" Apr 17 16:54:03.734635 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:03.734618 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkkbv\" (UniqueName: \"kubernetes.io/projected/e38f13c7-0aae-46bc-9fad-7df2e1de3aa2-kube-api-access-gkkbv\") pod \"openshift-state-metrics-9d44df66c-55zhx\" (UID: \"e38f13c7-0aae-46bc-9fad-7df2e1de3aa2\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-55zhx" Apr 17 16:54:03.735033 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:03.735014 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjh4x\" (UniqueName: \"kubernetes.io/projected/69985995-b454-4287-8e16-ad76e4c4e3e3-kube-api-access-jjh4x\") pod \"kube-state-metrics-69db897b98-64ttp\" (UID: \"69985995-b454-4287-8e16-ad76e4c4e3e3\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-64ttp" Apr 17 16:54:03.770784 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:03.770758 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-64ttp" Apr 17 16:54:03.782470 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:03.782443 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-zpgj6" Apr 17 16:54:03.792876 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:54:03.792844 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8cbfc4c3_65fb_4b43_a0f8_4b26ca71366c.slice/crio-74a43d3afd57cc315a4ed608706aacd646eed5ed156dda5eb32a6f18f054d522 WatchSource:0}: Error finding container 74a43d3afd57cc315a4ed608706aacd646eed5ed156dda5eb32a6f18f054d522: Status 404 returned error can't find the container with id 74a43d3afd57cc315a4ed608706aacd646eed5ed156dda5eb32a6f18f054d522 Apr 17 16:54:03.908253 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:03.908101 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-64ttp"] Apr 17 16:54:03.910229 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:54:03.910199 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69985995_b454_4287_8e16_ad76e4c4e3e3.slice/crio-720c985814cdc73d98ce67ff0cecb3599baebbf26e4c939fe56965d46ec3b74d WatchSource:0}: Error finding container 720c985814cdc73d98ce67ff0cecb3599baebbf26e4c939fe56965d46ec3b74d: Status 404 returned error can't find the container with id 720c985814cdc73d98ce67ff0cecb3599baebbf26e4c939fe56965d46ec3b74d Apr 17 16:54:03.991666 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:03.991579 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zpgj6" event={"ID":"8cbfc4c3-65fb-4b43-a0f8-4b26ca71366c","Type":"ContainerStarted","Data":"74a43d3afd57cc315a4ed608706aacd646eed5ed156dda5eb32a6f18f054d522"} Apr 17 16:54:03.992704 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:03.992682 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-64ttp" event={"ID":"69985995-b454-4287-8e16-ad76e4c4e3e3","Type":"ContainerStarted","Data":"720c985814cdc73d98ce67ff0cecb3599baebbf26e4c939fe56965d46ec3b74d"} Apr 17 16:54:04.230183 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:04.230146 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/e38f13c7-0aae-46bc-9fad-7df2e1de3aa2-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-55zhx\" (UID: \"e38f13c7-0aae-46bc-9fad-7df2e1de3aa2\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-55zhx" Apr 17 16:54:04.232553 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:04.232527 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/e38f13c7-0aae-46bc-9fad-7df2e1de3aa2-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-55zhx\" (UID: \"e38f13c7-0aae-46bc-9fad-7df2e1de3aa2\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-55zhx" Apr 17 16:54:04.362882 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:04.362790 2568 util.go:30] "No sandbox for pod can be found. 
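[Editor's note] Same pattern as the prometheus-operator-tls miss earlier: openshift-state-metrics-tls does not exist at 16:54:03.725, the operation is parked for 500ms, and the 16:54:04.230 retry mounts it cleanly. When a case like this does not self-heal, the lookup that is failing can be reproduced from a workstation with client-go (a sketch; the kubeconfig path is an assumption, and the kubelet itself reads through a local cache rather than issuing this Get directly):

package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Assumes ~/.kube/config; point this at whatever kubeconfig you use.
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	client := kubernetes.NewForConfigOrDie(cfg)

	// Equivalent to the lookup the kubelet's secret volume plugin needs
	// to satisfy before the mount can succeed.
	s, err := client.CoreV1().Secrets("openshift-monitoring").
		Get(context.TODO(), "openshift-state-metrics-tls", metav1.GetOptions{})
	if err != nil {
		fmt.Println("still missing:", err)
		return
	}
	fmt.Printf("found %s with %d keys\n", s.Name, len(s.Data))
}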
Apr 17 16:54:04.362882 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:04.362790 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-55zhx"
Apr 17 16:54:04.515494 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:04.515466 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-55zhx"]
Apr 17 16:54:04.674413 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:54:04.674340 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode38f13c7_0aae_46bc_9fad_7df2e1de3aa2.slice/crio-8c9afc139f6a3dd1e89ec5e1ad5d41e8dc1e011ca1a10d3b72765be78d35082e WatchSource:0}: Error finding container 8c9afc139f6a3dd1e89ec5e1ad5d41e8dc1e011ca1a10d3b72765be78d35082e: Status 404 returned error can't find the container with id 8c9afc139f6a3dd1e89ec5e1ad5d41e8dc1e011ca1a10d3b72765be78d35082e
Apr 17 16:54:04.996883 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:04.996841 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zpgj6" event={"ID":"8cbfc4c3-65fb-4b43-a0f8-4b26ca71366c","Type":"ContainerStarted","Data":"c80b217e8db672946bbce9ebfd5a234370bb24247af82f53717b148f29a995ab"}
Apr 17 16:54:04.998450 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:04.998425 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-55zhx" event={"ID":"e38f13c7-0aae-46bc-9fad-7df2e1de3aa2","Type":"ContainerStarted","Data":"c6b1777b6366e15211c9b0c859971cda9f9808aea5d885b247122e3e82dd6ee0"}
Apr 17 16:54:04.998450 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:04.998453 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-55zhx" event={"ID":"e38f13c7-0aae-46bc-9fad-7df2e1de3aa2","Type":"ContainerStarted","Data":"ee3130c4a0ae254c6145b8262cb835bc73bfae83cbb090b8c7835198867a559c"}
Apr 17 16:54:04.998624 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:04.998463 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-55zhx" event={"ID":"e38f13c7-0aae-46bc-9fad-7df2e1de3aa2","Type":"ContainerStarted","Data":"8c9afc139f6a3dd1e89ec5e1ad5d41e8dc1e011ca1a10d3b72765be78d35082e"}
Apr 17 16:54:06.002830 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:06.002799 2568 generic.go:358] "Generic (PLEG): container finished" podID="8cbfc4c3-65fb-4b43-a0f8-4b26ca71366c" containerID="c80b217e8db672946bbce9ebfd5a234370bb24247af82f53717b148f29a995ab" exitCode=0
Apr 17 16:54:06.003238 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:06.002883 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zpgj6" event={"ID":"8cbfc4c3-65fb-4b43-a0f8-4b26ca71366c","Type":"ContainerDied","Data":"c80b217e8db672946bbce9ebfd5a234370bb24247af82f53717b148f29a995ab"}
Apr 17 16:54:06.004948 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:06.004906 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-64ttp" event={"ID":"69985995-b454-4287-8e16-ad76e4c4e3e3","Type":"ContainerStarted","Data":"369abccec4435ac7b135f3340fc818fc2b9445c7f9d1fc9959aefed09294a0d5"}
Apr 17 16:54:06.005034 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:06.004955 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-64ttp" event={"ID":"69985995-b454-4287-8e16-ad76e4c4e3e3","Type":"ContainerStarted","Data":"3afd886ad96108e44da47ed059f084401579c6eb7f9721b04f211f05313f9f3d"}
Apr 17 16:54:06.005034 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:06.004964 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-64ttp" event={"ID":"69985995-b454-4287-8e16-ad76e4c4e3e3","Type":"ContainerStarted","Data":"c9f37107538839e683ea3d63a560baf12c394ac1e2ecbe784eeac89446e628fb"}
Apr 17 16:54:06.043687 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:06.043620 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-64ttp" podStartSLOduration=1.583265358 podStartE2EDuration="3.043601745s" podCreationTimestamp="2026-04-17 16:54:03 +0000 UTC" firstStartedPulling="2026-04-17 16:54:03.912210037 +0000 UTC m=+88.819094832" lastFinishedPulling="2026-04-17 16:54:05.372546424 +0000 UTC m=+90.279431219" observedRunningTime="2026-04-17 16:54:06.042947975 +0000 UTC m=+90.949832781" watchObservedRunningTime="2026-04-17 16:54:06.043601745 +0000 UTC m=+90.950486563"
Apr 17 16:54:06.528276 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:06.528239 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-757c9669d9-7849l"]
Apr 17 16:54:06.532253 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:06.532231 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-757c9669d9-7849l"
Apr 17 16:54:06.535846 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:06.535821 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\""
Apr 17 16:54:06.536273 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:06.536251 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\""
Apr 17 16:54:06.536364 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:06.536300 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-kccrl\""
Apr 17 16:54:06.536429 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:06.536405 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\""
Apr 17 16:54:06.536482 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:06.536425 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\""
Apr 17 16:54:06.536584 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:06.536566 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\""
Apr 17 16:54:06.536667 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:06.536586 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-29migf2ai9o96\""
Apr 17 16:54:06.551224 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:06.550879 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-757c9669d9-7849l"]
\"kubernetes.io/projected/84be4b0e-4b66-4061-a3d5-3a8708d4255b-kube-api-access-mwqzp\") pod \"thanos-querier-757c9669d9-7849l\" (UID: \"84be4b0e-4b66-4061-a3d5-3a8708d4255b\") " pod="openshift-monitoring/thanos-querier-757c9669d9-7849l" Apr 17 16:54:06.652826 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:06.652614 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/84be4b0e-4b66-4061-a3d5-3a8708d4255b-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-757c9669d9-7849l\" (UID: \"84be4b0e-4b66-4061-a3d5-3a8708d4255b\") " pod="openshift-monitoring/thanos-querier-757c9669d9-7849l" Apr 17 16:54:06.652826 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:06.652645 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/84be4b0e-4b66-4061-a3d5-3a8708d4255b-metrics-client-ca\") pod \"thanos-querier-757c9669d9-7849l\" (UID: \"84be4b0e-4b66-4061-a3d5-3a8708d4255b\") " pod="openshift-monitoring/thanos-querier-757c9669d9-7849l" Apr 17 16:54:06.652826 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:06.652690 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/84be4b0e-4b66-4061-a3d5-3a8708d4255b-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-757c9669d9-7849l\" (UID: \"84be4b0e-4b66-4061-a3d5-3a8708d4255b\") " pod="openshift-monitoring/thanos-querier-757c9669d9-7849l" Apr 17 16:54:06.652826 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:06.652730 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/84be4b0e-4b66-4061-a3d5-3a8708d4255b-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-757c9669d9-7849l\" (UID: \"84be4b0e-4b66-4061-a3d5-3a8708d4255b\") " pod="openshift-monitoring/thanos-querier-757c9669d9-7849l" Apr 17 16:54:06.652826 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:06.652760 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/84be4b0e-4b66-4061-a3d5-3a8708d4255b-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-757c9669d9-7849l\" (UID: \"84be4b0e-4b66-4061-a3d5-3a8708d4255b\") " pod="openshift-monitoring/thanos-querier-757c9669d9-7849l" Apr 17 16:54:06.652826 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:06.652815 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/84be4b0e-4b66-4061-a3d5-3a8708d4255b-secret-thanos-querier-tls\") pod \"thanos-querier-757c9669d9-7849l\" (UID: \"84be4b0e-4b66-4061-a3d5-3a8708d4255b\") " pod="openshift-monitoring/thanos-querier-757c9669d9-7849l" Apr 17 16:54:06.653050 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:06.652844 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/84be4b0e-4b66-4061-a3d5-3a8708d4255b-secret-grpc-tls\") pod \"thanos-querier-757c9669d9-7849l\" (UID: \"84be4b0e-4b66-4061-a3d5-3a8708d4255b\") " 
pod="openshift-monitoring/thanos-querier-757c9669d9-7849l" Apr 17 16:54:06.753892 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:06.753855 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/84be4b0e-4b66-4061-a3d5-3a8708d4255b-secret-thanos-querier-tls\") pod \"thanos-querier-757c9669d9-7849l\" (UID: \"84be4b0e-4b66-4061-a3d5-3a8708d4255b\") " pod="openshift-monitoring/thanos-querier-757c9669d9-7849l" Apr 17 16:54:06.753892 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:06.753904 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/84be4b0e-4b66-4061-a3d5-3a8708d4255b-secret-grpc-tls\") pod \"thanos-querier-757c9669d9-7849l\" (UID: \"84be4b0e-4b66-4061-a3d5-3a8708d4255b\") " pod="openshift-monitoring/thanos-querier-757c9669d9-7849l" Apr 17 16:54:06.754168 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:06.754000 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mwqzp\" (UniqueName: \"kubernetes.io/projected/84be4b0e-4b66-4061-a3d5-3a8708d4255b-kube-api-access-mwqzp\") pod \"thanos-querier-757c9669d9-7849l\" (UID: \"84be4b0e-4b66-4061-a3d5-3a8708d4255b\") " pod="openshift-monitoring/thanos-querier-757c9669d9-7849l" Apr 17 16:54:06.754168 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:06.754033 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/84be4b0e-4b66-4061-a3d5-3a8708d4255b-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-757c9669d9-7849l\" (UID: \"84be4b0e-4b66-4061-a3d5-3a8708d4255b\") " pod="openshift-monitoring/thanos-querier-757c9669d9-7849l" Apr 17 16:54:06.754168 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:06.754062 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/84be4b0e-4b66-4061-a3d5-3a8708d4255b-metrics-client-ca\") pod \"thanos-querier-757c9669d9-7849l\" (UID: \"84be4b0e-4b66-4061-a3d5-3a8708d4255b\") " pod="openshift-monitoring/thanos-querier-757c9669d9-7849l" Apr 17 16:54:06.754168 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:06.754095 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/84be4b0e-4b66-4061-a3d5-3a8708d4255b-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-757c9669d9-7849l\" (UID: \"84be4b0e-4b66-4061-a3d5-3a8708d4255b\") " pod="openshift-monitoring/thanos-querier-757c9669d9-7849l" Apr 17 16:54:06.754168 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:06.754153 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/84be4b0e-4b66-4061-a3d5-3a8708d4255b-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-757c9669d9-7849l\" (UID: \"84be4b0e-4b66-4061-a3d5-3a8708d4255b\") " pod="openshift-monitoring/thanos-querier-757c9669d9-7849l" Apr 17 16:54:06.754505 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:06.754477 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/84be4b0e-4b66-4061-a3d5-3a8708d4255b-secret-thanos-querier-kube-rbac-proxy-metrics\") pod 
\"thanos-querier-757c9669d9-7849l\" (UID: \"84be4b0e-4b66-4061-a3d5-3a8708d4255b\") " pod="openshift-monitoring/thanos-querier-757c9669d9-7849l" Apr 17 16:54:06.754831 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:06.754782 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/84be4b0e-4b66-4061-a3d5-3a8708d4255b-metrics-client-ca\") pod \"thanos-querier-757c9669d9-7849l\" (UID: \"84be4b0e-4b66-4061-a3d5-3a8708d4255b\") " pod="openshift-monitoring/thanos-querier-757c9669d9-7849l" Apr 17 16:54:06.757701 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:06.757648 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/84be4b0e-4b66-4061-a3d5-3a8708d4255b-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-757c9669d9-7849l\" (UID: \"84be4b0e-4b66-4061-a3d5-3a8708d4255b\") " pod="openshift-monitoring/thanos-querier-757c9669d9-7849l" Apr 17 16:54:06.757832 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:06.757706 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/84be4b0e-4b66-4061-a3d5-3a8708d4255b-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-757c9669d9-7849l\" (UID: \"84be4b0e-4b66-4061-a3d5-3a8708d4255b\") " pod="openshift-monitoring/thanos-querier-757c9669d9-7849l" Apr 17 16:54:06.757832 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:06.757756 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/84be4b0e-4b66-4061-a3d5-3a8708d4255b-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-757c9669d9-7849l\" (UID: \"84be4b0e-4b66-4061-a3d5-3a8708d4255b\") " pod="openshift-monitoring/thanos-querier-757c9669d9-7849l" Apr 17 16:54:06.757995 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:06.757863 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/84be4b0e-4b66-4061-a3d5-3a8708d4255b-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-757c9669d9-7849l\" (UID: \"84be4b0e-4b66-4061-a3d5-3a8708d4255b\") " pod="openshift-monitoring/thanos-querier-757c9669d9-7849l" Apr 17 16:54:06.757995 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:06.757948 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/84be4b0e-4b66-4061-a3d5-3a8708d4255b-secret-thanos-querier-tls\") pod \"thanos-querier-757c9669d9-7849l\" (UID: \"84be4b0e-4b66-4061-a3d5-3a8708d4255b\") " pod="openshift-monitoring/thanos-querier-757c9669d9-7849l" Apr 17 16:54:06.758087 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:06.758007 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/84be4b0e-4b66-4061-a3d5-3a8708d4255b-secret-grpc-tls\") pod \"thanos-querier-757c9669d9-7849l\" (UID: \"84be4b0e-4b66-4061-a3d5-3a8708d4255b\") " pod="openshift-monitoring/thanos-querier-757c9669d9-7849l" Apr 17 16:54:06.764991 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:06.764972 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwqzp\" (UniqueName: 
\"kubernetes.io/projected/84be4b0e-4b66-4061-a3d5-3a8708d4255b-kube-api-access-mwqzp\") pod \"thanos-querier-757c9669d9-7849l\" (UID: \"84be4b0e-4b66-4061-a3d5-3a8708d4255b\") " pod="openshift-monitoring/thanos-querier-757c9669d9-7849l" Apr 17 16:54:06.842959 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:06.842859 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-757c9669d9-7849l" Apr 17 16:54:06.943610 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:06.943578 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7d7978c455-g5j8f" Apr 17 16:54:06.943753 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:06.943629 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7d7978c455-g5j8f" Apr 17 16:54:06.949316 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:06.949065 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7d7978c455-g5j8f" Apr 17 16:54:06.975986 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:06.975939 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-757c9669d9-7849l"] Apr 17 16:54:06.978885 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:54:06.978858 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84be4b0e_4b66_4061_a3d5_3a8708d4255b.slice/crio-662668ebbdca6af946bb7694d4322f0eada90173ff01da8c2975ffca34b770f6 WatchSource:0}: Error finding container 662668ebbdca6af946bb7694d4322f0eada90173ff01da8c2975ffca34b770f6: Status 404 returned error can't find the container with id 662668ebbdca6af946bb7694d4322f0eada90173ff01da8c2975ffca34b770f6 Apr 17 16:54:07.014491 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:07.014458 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zpgj6" event={"ID":"8cbfc4c3-65fb-4b43-a0f8-4b26ca71366c","Type":"ContainerStarted","Data":"f452dc7b45d506ac3d8011b86a64fd40d2bbf73b89460447620a967c0c47ff49"} Apr 17 16:54:07.014491 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:07.014499 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zpgj6" event={"ID":"8cbfc4c3-65fb-4b43-a0f8-4b26ca71366c","Type":"ContainerStarted","Data":"4af4e3c9e65f35c67f133cbfd9f98c01708ff4beff58f1274fd7c114745fc5ae"} Apr 17 16:54:07.016460 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:07.016433 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-55zhx" event={"ID":"e38f13c7-0aae-46bc-9fad-7df2e1de3aa2","Type":"ContainerStarted","Data":"e74d93445349dc3f215094b97a463126610e37894c2c221ef25c900fc666a2cd"} Apr 17 16:54:07.017509 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:07.017483 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-757c9669d9-7849l" event={"ID":"84be4b0e-4b66-4061-a3d5-3a8708d4255b","Type":"ContainerStarted","Data":"662668ebbdca6af946bb7694d4322f0eada90173ff01da8c2975ffca34b770f6"} Apr 17 16:54:07.021427 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:07.021405 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7d7978c455-g5j8f" Apr 17 16:54:07.072634 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:07.072576 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/node-exporter-zpgj6" podStartSLOduration=3.138947074 podStartE2EDuration="4.072562016s" podCreationTimestamp="2026-04-17 16:54:03 +0000 UTC" firstStartedPulling="2026-04-17 16:54:03.795616301 +0000 UTC m=+88.702501096" lastFinishedPulling="2026-04-17 16:54:04.72923123 +0000 UTC m=+89.636116038" observedRunningTime="2026-04-17 16:54:07.038831252 +0000 UTC m=+91.945716069" watchObservedRunningTime="2026-04-17 16:54:07.072562016 +0000 UTC m=+91.979446834" Apr 17 16:54:07.088486 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:07.088435 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-55zhx" podStartSLOduration=2.868558855 podStartE2EDuration="4.088417856s" podCreationTimestamp="2026-04-17 16:54:03 +0000 UTC" firstStartedPulling="2026-04-17 16:54:04.837640284 +0000 UTC m=+89.744525083" lastFinishedPulling="2026-04-17 16:54:06.057499275 +0000 UTC m=+90.964384084" observedRunningTime="2026-04-17 16:54:07.087505618 +0000 UTC m=+91.994390435" watchObservedRunningTime="2026-04-17 16:54:07.088417856 +0000 UTC m=+91.995302676" Apr 17 16:54:08.240752 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:08.240722 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-fqfhp"] Apr 17 16:54:08.245495 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:08.245470 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-fqfhp" Apr 17 16:54:08.248307 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:08.248283 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-968wx\"" Apr 17 16:54:08.248431 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:08.248283 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 17 16:54:08.252965 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:08.252943 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-fqfhp"] Apr 17 16:54:08.268145 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:08.268087 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0062c6f7-d03d-47c2-b629-6c7c3639acb9-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-fqfhp\" (UID: \"0062c6f7-d03d-47c2-b629-6c7c3639acb9\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-fqfhp" Apr 17 16:54:08.368897 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:08.368851 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0062c6f7-d03d-47c2-b629-6c7c3639acb9-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-fqfhp\" (UID: \"0062c6f7-d03d-47c2-b629-6c7c3639acb9\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-fqfhp" Apr 17 16:54:08.369080 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:54:08.368991 2568 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 17 16:54:08.369080 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:54:08.369058 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0062c6f7-d03d-47c2-b629-6c7c3639acb9-monitoring-plugin-cert 
podName:0062c6f7-d03d-47c2-b629-6c7c3639acb9 nodeName:}" failed. No retries permitted until 2026-04-17 16:54:08.869038236 +0000 UTC m=+93.775923030 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/0062c6f7-d03d-47c2-b629-6c7c3639acb9-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-fqfhp" (UID: "0062c6f7-d03d-47c2-b629-6c7c3639acb9") : secret "monitoring-plugin-cert" not found Apr 17 16:54:08.873355 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:08.873255 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0062c6f7-d03d-47c2-b629-6c7c3639acb9-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-fqfhp\" (UID: \"0062c6f7-d03d-47c2-b629-6c7c3639acb9\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-fqfhp" Apr 17 16:54:08.875827 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:08.875803 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0062c6f7-d03d-47c2-b629-6c7c3639acb9-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-fqfhp\" (UID: \"0062c6f7-d03d-47c2-b629-6c7c3639acb9\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-fqfhp" Apr 17 16:54:09.025651 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:09.025623 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-757c9669d9-7849l" event={"ID":"84be4b0e-4b66-4061-a3d5-3a8708d4255b","Type":"ContainerStarted","Data":"ba5440789a09584492c9073559ef5858a1c7186eed47fdd2ed8ddb29f193482f"} Apr 17 16:54:09.156543 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:09.156514 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-fqfhp" Apr 17 16:54:09.273595 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:09.273572 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-fqfhp"] Apr 17 16:54:09.276033 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:54:09.276009 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0062c6f7_d03d_47c2_b629_6c7c3639acb9.slice/crio-754aa9bcf4fd79f944758ca19428ecc7d18d7fcb3c8d32257244e864b4a4817e WatchSource:0}: Error finding container 754aa9bcf4fd79f944758ca19428ecc7d18d7fcb3c8d32257244e864b4a4817e: Status 404 returned error can't find the container with id 754aa9bcf4fd79f944758ca19428ecc7d18d7fcb3c8d32257244e864b4a4817e Apr 17 16:54:09.667585 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:09.667543 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 16:54:09.676589 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:09.676561 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:54:09.679286 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:09.679262 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 17 16:54:09.680960 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:09.680694 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 17 16:54:09.680960 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:09.680804 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:54:09.680960 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:09.680841 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:54:09.680960 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:09.680896 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:54:09.680960 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:09.680945 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:54:09.681286 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:09.680980 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:54:09.681286 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:09.681005 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-config-out\") pod \"prometheus-k8s-0\" (UID: \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:54:09.681286 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:09.681069 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:54:09.681286 ip-10-0-134-88 
kubenswrapper[2568]: I0417 16:54:09.681108 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:54:09.681286 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:09.681136 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:54:09.681286 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:09.681160 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sg984\" (UniqueName: \"kubernetes.io/projected/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-kube-api-access-sg984\") pod \"prometheus-k8s-0\" (UID: \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:54:09.681286 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:09.681187 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-config\") pod \"prometheus-k8s-0\" (UID: \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:54:09.681286 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:09.681219 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:54:09.681286 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:09.681272 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:54:09.681715 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:09.681295 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-web-config\") pod \"prometheus-k8s-0\" (UID: \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:54:09.681715 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:09.681327 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:54:09.681715 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:09.681348 2568 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 17 16:54:09.681715 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:09.681481 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 17 16:54:09.681715 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:09.681352 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:54:09.681715 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:09.681574 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:54:09.681715 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:09.681620 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:54:09.682123 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:09.682002 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 17 16:54:09.682282 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:09.682260 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 17 16:54:09.682399 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:09.682321 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-tlgmm\"" Apr 17 16:54:09.682573 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:09.682460 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 17 16:54:09.682573 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:09.682477 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 17 16:54:09.682573 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:09.682542 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 17 16:54:09.682573 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:09.682558 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-11djqkr3n0v60\"" Apr 17 16:54:09.682817 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:09.682728 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 17 16:54:09.682817 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:09.682735 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 17 16:54:09.685837 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:09.685582 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 17 16:54:09.688860 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:09.687997 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 17 16:54:09.697642 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:09.697620 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 16:54:09.782175 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:09.782141 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:54:09.782359 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:09.782183 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:54:09.782359 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:09.782213 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:54:09.782359 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:09.782237 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-config-out\") pod \"prometheus-k8s-0\" (UID: \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:54:09.782359 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:09.782261 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:54:09.782571 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:09.782387 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:54:09.782571 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:09.782434 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:54:09.782571 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:09.782459 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sg984\" (UniqueName: \"kubernetes.io/projected/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-kube-api-access-sg984\") pod \"prometheus-k8s-0\" (UID: \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:54:09.782571 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:09.782484 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-config\") pod \"prometheus-k8s-0\" (UID: \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:54:09.782571 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:09.782516 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:54:09.782571 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:09.782571 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:54:09.782861 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:09.782594 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-web-config\") pod \"prometheus-k8s-0\" (UID: \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:54:09.782861 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:09.782626 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:54:09.782861 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:09.782670 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:54:09.782861 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:09.782705 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:54:09.782861 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:09.782716 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: 
\"kubernetes.io/empty-dir/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:54:09.782861 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:09.782774 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:54:09.782861 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:09.782837 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:54:09.783224 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:09.782877 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:54:09.783273 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:09.783215 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:54:09.784199 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:09.783874 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:54:09.785958 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:09.785270 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:54:09.785958 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:09.785408 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-config-out\") pod \"prometheus-k8s-0\" (UID: \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:54:09.785958 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:09.785420 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:54:09.785958 ip-10-0-134-88 kubenswrapper[2568]: E0417 
16:54:09.785412 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-prometheus-trusted-ca-bundle podName:9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8 nodeName:}" failed. No retries permitted until 2026-04-17 16:54:10.28538947 +0000 UTC m=+95.192274264 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-prometheus-trusted-ca-bundle") pod "prometheus-k8s-0" (UID: "9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8") : configmap references non-existent config key: ca-bundle.crt Apr 17 16:54:09.785958 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:09.785808 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:54:09.785958 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:09.785844 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:54:09.786357 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:09.785992 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:54:09.786664 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:09.786638 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:54:09.786664 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:09.786656 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:54:09.787134 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:09.787111 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:54:09.787848 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:09.787820 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-web-config\") pod \"prometheus-k8s-0\" (UID: \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:54:09.788007 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:09.787988 2568 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:54:09.788249 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:09.788233 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:54:09.788294 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:09.788234 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-config\") pod \"prometheus-k8s-0\" (UID: \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:54:09.795124 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:09.795092 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sg984\" (UniqueName: \"kubernetes.io/projected/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-kube-api-access-sg984\") pod \"prometheus-k8s-0\" (UID: \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:54:10.033994 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:10.033947 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-757c9669d9-7849l" event={"ID":"84be4b0e-4b66-4061-a3d5-3a8708d4255b","Type":"ContainerStarted","Data":"5906acd85d3c9a1bc1699bcdd62c037657f24da1e788bd609251cf2a58af418c"} Apr 17 16:54:10.033994 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:10.033991 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-757c9669d9-7849l" event={"ID":"84be4b0e-4b66-4061-a3d5-3a8708d4255b","Type":"ContainerStarted","Data":"11910ba6245d2e1d847d0e5046d116649b028cae0ad0fd8d5702c34b05334aeb"} Apr 17 16:54:10.035886 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:10.035853 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-fqfhp" event={"ID":"0062c6f7-d03d-47c2-b629-6c7c3639acb9","Type":"ContainerStarted","Data":"754aa9bcf4fd79f944758ca19428ecc7d18d7fcb3c8d32257244e864b4a4817e"} Apr 17 16:54:10.288336 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:10.288149 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:54:10.289242 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:10.289171 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:54:10.291402 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:10.291219 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:54:10.432256 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:10.432223 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 16:54:10.661304 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:54:10.661230 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e6ab01b_20d1_430f_8cfa_98f6ffe0f7d8.slice/crio-25397a07207b747d51d81da58233732ed93e4266b67d76e126adaa662fc1291f WatchSource:0}: Error finding container 25397a07207b747d51d81da58233732ed93e4266b67d76e126adaa662fc1291f: Status 404 returned error can't find the container with id 25397a07207b747d51d81da58233732ed93e4266b67d76e126adaa662fc1291f Apr 17 16:54:11.041104 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:11.041056 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-757c9669d9-7849l" event={"ID":"84be4b0e-4b66-4061-a3d5-3a8708d4255b","Type":"ContainerStarted","Data":"c93cd593853bdb9f98e2bcf46180f24f68921ec64c4dc24bc7958edc9ae597fe"} Apr 17 16:54:11.041104 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:11.041107 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-757c9669d9-7849l" event={"ID":"84be4b0e-4b66-4061-a3d5-3a8708d4255b","Type":"ContainerStarted","Data":"3370b7432fa2f0da2189287df44b6f3155c764c1eac2cb7e263b7678aa25a9ae"} Apr 17 16:54:11.041322 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:11.041122 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-757c9669d9-7849l" event={"ID":"84be4b0e-4b66-4061-a3d5-3a8708d4255b","Type":"ContainerStarted","Data":"58db096d4b2126750006d7243ece3791caeda661e524359b07fa703604df3df5"} Apr 17 16:54:11.041322 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:11.041269 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-757c9669d9-7849l" Apr 17 16:54:11.042979 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:11.042952 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-fqfhp" event={"ID":"0062c6f7-d03d-47c2-b629-6c7c3639acb9","Type":"ContainerStarted","Data":"79da6bf0984c6450e0e96d9c0c837cddb310a2522bef0750b1d33cbc0ca7a26b"} Apr 17 16:54:11.043119 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:11.043096 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-fqfhp" Apr 17 16:54:11.044173 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:11.044153 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8","Type":"ContainerStarted","Data":"25397a07207b747d51d81da58233732ed93e4266b67d76e126adaa662fc1291f"} Apr 17 16:54:11.048156 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:11.048140 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-fqfhp" Apr 17 16:54:11.064471 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:11.064422 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-757c9669d9-7849l" podStartSLOduration=2.024948167 podStartE2EDuration="5.064412727s" podCreationTimestamp="2026-04-17 16:54:06 +0000 UTC" firstStartedPulling="2026-04-17 
16:54:06.981119318 +0000 UTC m=+91.888004112" lastFinishedPulling="2026-04-17 16:54:10.020583861 +0000 UTC m=+94.927468672" observedRunningTime="2026-04-17 16:54:11.06313946 +0000 UTC m=+95.970024319" watchObservedRunningTime="2026-04-17 16:54:11.064412727 +0000 UTC m=+95.971297542" Apr 17 16:54:11.077800 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:11.077764 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-fqfhp" podStartSLOduration=1.6476485570000001 podStartE2EDuration="3.077755032s" podCreationTimestamp="2026-04-17 16:54:08 +0000 UTC" firstStartedPulling="2026-04-17 16:54:09.277802407 +0000 UTC m=+94.184687201" lastFinishedPulling="2026-04-17 16:54:10.707908879 +0000 UTC m=+95.614793676" observedRunningTime="2026-04-17 16:54:11.077179668 +0000 UTC m=+95.984064484" watchObservedRunningTime="2026-04-17 16:54:11.077755032 +0000 UTC m=+95.984639848" Apr 17 16:54:12.048555 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:12.048524 2568 generic.go:358] "Generic (PLEG): container finished" podID="9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8" containerID="bd24ffa4a7a8e8bde9da58fef6c3249660f60e8474397f71346d28d6efe41d2f" exitCode=0 Apr 17 16:54:12.048976 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:12.048630 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8","Type":"ContainerDied","Data":"bd24ffa4a7a8e8bde9da58fef6c3249660f60e8474397f71346d28d6efe41d2f"} Apr 17 16:54:14.060876 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:14.060840 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-c655f6d99-4hfv9"] Apr 17 16:54:14.064576 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:14.064552 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-c655f6d99-4hfv9" Apr 17 16:54:14.073939 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:14.073900 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 17 16:54:14.074621 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:14.074597 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-c655f6d99-4hfv9"] Apr 17 16:54:14.125866 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:14.125835 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/97a44237-c154-4a0b-b8ca-c4e5d84651c4-service-ca\") pod \"console-c655f6d99-4hfv9\" (UID: \"97a44237-c154-4a0b-b8ca-c4e5d84651c4\") " pod="openshift-console/console-c655f6d99-4hfv9" Apr 17 16:54:14.126055 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:14.125906 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/97a44237-c154-4a0b-b8ca-c4e5d84651c4-console-serving-cert\") pod \"console-c655f6d99-4hfv9\" (UID: \"97a44237-c154-4a0b-b8ca-c4e5d84651c4\") " pod="openshift-console/console-c655f6d99-4hfv9" Apr 17 16:54:14.126055 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:14.125963 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/97a44237-c154-4a0b-b8ca-c4e5d84651c4-oauth-serving-cert\") pod \"console-c655f6d99-4hfv9\" (UID: \"97a44237-c154-4a0b-b8ca-c4e5d84651c4\") " pod="openshift-console/console-c655f6d99-4hfv9" Apr 17 16:54:14.126055 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:14.126031 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-927tr\" (UniqueName: \"kubernetes.io/projected/97a44237-c154-4a0b-b8ca-c4e5d84651c4-kube-api-access-927tr\") pod \"console-c655f6d99-4hfv9\" (UID: \"97a44237-c154-4a0b-b8ca-c4e5d84651c4\") " pod="openshift-console/console-c655f6d99-4hfv9" Apr 17 16:54:14.126221 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:14.126081 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/97a44237-c154-4a0b-b8ca-c4e5d84651c4-console-config\") pod \"console-c655f6d99-4hfv9\" (UID: \"97a44237-c154-4a0b-b8ca-c4e5d84651c4\") " pod="openshift-console/console-c655f6d99-4hfv9" Apr 17 16:54:14.126221 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:14.126115 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/97a44237-c154-4a0b-b8ca-c4e5d84651c4-console-oauth-config\") pod \"console-c655f6d99-4hfv9\" (UID: \"97a44237-c154-4a0b-b8ca-c4e5d84651c4\") " pod="openshift-console/console-c655f6d99-4hfv9" Apr 17 16:54:14.126329 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:14.126218 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/97a44237-c154-4a0b-b8ca-c4e5d84651c4-trusted-ca-bundle\") pod \"console-c655f6d99-4hfv9\" (UID: \"97a44237-c154-4a0b-b8ca-c4e5d84651c4\") " pod="openshift-console/console-c655f6d99-4hfv9" Apr 17 16:54:14.227372 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:14.227326 
2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/97a44237-c154-4a0b-b8ca-c4e5d84651c4-service-ca\") pod \"console-c655f6d99-4hfv9\" (UID: \"97a44237-c154-4a0b-b8ca-c4e5d84651c4\") " pod="openshift-console/console-c655f6d99-4hfv9" Apr 17 16:54:14.227543 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:14.227409 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/97a44237-c154-4a0b-b8ca-c4e5d84651c4-console-serving-cert\") pod \"console-c655f6d99-4hfv9\" (UID: \"97a44237-c154-4a0b-b8ca-c4e5d84651c4\") " pod="openshift-console/console-c655f6d99-4hfv9" Apr 17 16:54:14.227543 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:14.227449 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/97a44237-c154-4a0b-b8ca-c4e5d84651c4-oauth-serving-cert\") pod \"console-c655f6d99-4hfv9\" (UID: \"97a44237-c154-4a0b-b8ca-c4e5d84651c4\") " pod="openshift-console/console-c655f6d99-4hfv9" Apr 17 16:54:14.227543 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:14.227483 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-927tr\" (UniqueName: \"kubernetes.io/projected/97a44237-c154-4a0b-b8ca-c4e5d84651c4-kube-api-access-927tr\") pod \"console-c655f6d99-4hfv9\" (UID: \"97a44237-c154-4a0b-b8ca-c4e5d84651c4\") " pod="openshift-console/console-c655f6d99-4hfv9" Apr 17 16:54:14.227543 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:14.227526 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/97a44237-c154-4a0b-b8ca-c4e5d84651c4-console-config\") pod \"console-c655f6d99-4hfv9\" (UID: \"97a44237-c154-4a0b-b8ca-c4e5d84651c4\") " pod="openshift-console/console-c655f6d99-4hfv9" Apr 17 16:54:14.227767 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:14.227558 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/97a44237-c154-4a0b-b8ca-c4e5d84651c4-console-oauth-config\") pod \"console-c655f6d99-4hfv9\" (UID: \"97a44237-c154-4a0b-b8ca-c4e5d84651c4\") " pod="openshift-console/console-c655f6d99-4hfv9" Apr 17 16:54:14.227767 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:14.227619 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/97a44237-c154-4a0b-b8ca-c4e5d84651c4-trusted-ca-bundle\") pod \"console-c655f6d99-4hfv9\" (UID: \"97a44237-c154-4a0b-b8ca-c4e5d84651c4\") " pod="openshift-console/console-c655f6d99-4hfv9" Apr 17 16:54:14.228152 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:14.228122 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/97a44237-c154-4a0b-b8ca-c4e5d84651c4-service-ca\") pod \"console-c655f6d99-4hfv9\" (UID: \"97a44237-c154-4a0b-b8ca-c4e5d84651c4\") " pod="openshift-console/console-c655f6d99-4hfv9" Apr 17 16:54:14.228461 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:14.228410 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/97a44237-c154-4a0b-b8ca-c4e5d84651c4-console-config\") pod \"console-c655f6d99-4hfv9\" (UID: \"97a44237-c154-4a0b-b8ca-c4e5d84651c4\") " 
pod="openshift-console/console-c655f6d99-4hfv9" Apr 17 16:54:14.228581 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:14.228505 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/97a44237-c154-4a0b-b8ca-c4e5d84651c4-oauth-serving-cert\") pod \"console-c655f6d99-4hfv9\" (UID: \"97a44237-c154-4a0b-b8ca-c4e5d84651c4\") " pod="openshift-console/console-c655f6d99-4hfv9" Apr 17 16:54:14.228581 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:14.228515 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/97a44237-c154-4a0b-b8ca-c4e5d84651c4-trusted-ca-bundle\") pod \"console-c655f6d99-4hfv9\" (UID: \"97a44237-c154-4a0b-b8ca-c4e5d84651c4\") " pod="openshift-console/console-c655f6d99-4hfv9" Apr 17 16:54:14.230258 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:14.230230 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/97a44237-c154-4a0b-b8ca-c4e5d84651c4-console-serving-cert\") pod \"console-c655f6d99-4hfv9\" (UID: \"97a44237-c154-4a0b-b8ca-c4e5d84651c4\") " pod="openshift-console/console-c655f6d99-4hfv9" Apr 17 16:54:14.230497 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:14.230479 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/97a44237-c154-4a0b-b8ca-c4e5d84651c4-console-oauth-config\") pod \"console-c655f6d99-4hfv9\" (UID: \"97a44237-c154-4a0b-b8ca-c4e5d84651c4\") " pod="openshift-console/console-c655f6d99-4hfv9" Apr 17 16:54:14.236644 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:14.236608 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-927tr\" (UniqueName: \"kubernetes.io/projected/97a44237-c154-4a0b-b8ca-c4e5d84651c4-kube-api-access-927tr\") pod \"console-c655f6d99-4hfv9\" (UID: \"97a44237-c154-4a0b-b8ca-c4e5d84651c4\") " pod="openshift-console/console-c655f6d99-4hfv9" Apr 17 16:54:14.376801 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:14.376710 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-c655f6d99-4hfv9" Apr 17 16:54:14.843082 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:14.843061 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-c655f6d99-4hfv9"] Apr 17 16:54:14.845817 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:54:14.845789 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97a44237_c154_4a0b_b8ca_c4e5d84651c4.slice/crio-d6f5a6eb2a4c1e365b91cee437f936707e06c100fd64fe382db2538dd57dc360 WatchSource:0}: Error finding container d6f5a6eb2a4c1e365b91cee437f936707e06c100fd64fe382db2538dd57dc360: Status 404 returned error can't find the container with id d6f5a6eb2a4c1e365b91cee437f936707e06c100fd64fe382db2538dd57dc360 Apr 17 16:54:15.059608 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:15.059579 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c655f6d99-4hfv9" event={"ID":"97a44237-c154-4a0b-b8ca-c4e5d84651c4","Type":"ContainerStarted","Data":"eec91463a9b2f354a8f8435baf26688ec74268fdf7ae858db2e9f465010517e9"} Apr 17 16:54:15.059713 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:15.059614 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c655f6d99-4hfv9" event={"ID":"97a44237-c154-4a0b-b8ca-c4e5d84651c4","Type":"ContainerStarted","Data":"d6f5a6eb2a4c1e365b91cee437f936707e06c100fd64fe382db2538dd57dc360"} Apr 17 16:54:15.062682 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:15.062659 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8","Type":"ContainerStarted","Data":"6565f02beeefc8daa974029680f8420623db71beee8d84033a9377bbd77e5432"} Apr 17 16:54:15.062682 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:15.062685 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8","Type":"ContainerStarted","Data":"9909fa1c5f66747bf23b24b40cd1143d1801c1d1be73a0b4651202a57c33270d"} Apr 17 16:54:15.063046 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:15.062693 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8","Type":"ContainerStarted","Data":"906ee137452794f3190404a6373110b04efcfdf789ff1835de21a5e1cb8c234d"} Apr 17 16:54:15.079384 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:15.079292 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-c655f6d99-4hfv9" podStartSLOduration=1.07927296 podStartE2EDuration="1.07927296s" podCreationTimestamp="2026-04-17 16:54:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:54:15.07800943 +0000 UTC m=+99.984894246" watchObservedRunningTime="2026-04-17 16:54:15.07927296 +0000 UTC m=+99.986157776" Apr 17 16:54:16.032472 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:16.032407 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-588fbfbcd9-ddtmz" podUID="5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70" containerName="registry" containerID="cri-o://005ca94f7ff507962c0af1d70423cc4c2d4cf0826cebdb1701c5cc7c04f887e0" gracePeriod=30 Apr 17 16:54:16.068375 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:16.068343 2568 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8","Type":"ContainerStarted","Data":"1eb656c40a57a84992cb8f0ebbb38ee86b307025ba55d276cc81ec3b2cfad471"} Apr 17 16:54:16.068375 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:16.068379 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8","Type":"ContainerStarted","Data":"04dfa7028adad692c05a69a067da0490a0f2fd17eec58745421c177993c0a122"} Apr 17 16:54:16.068832 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:16.068389 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8","Type":"ContainerStarted","Data":"beca15a9764c9d7df18e179636968d4963642cdda0ca0f5dad6566ab6f246e4b"} Apr 17 16:54:16.098156 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:16.098085 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.989502912 podStartE2EDuration="7.098069353s" podCreationTimestamp="2026-04-17 16:54:09 +0000 UTC" firstStartedPulling="2026-04-17 16:54:10.663108032 +0000 UTC m=+95.569992826" lastFinishedPulling="2026-04-17 16:54:14.771674454 +0000 UTC m=+99.678559267" observedRunningTime="2026-04-17 16:54:16.094947693 +0000 UTC m=+101.001832509" watchObservedRunningTime="2026-04-17 16:54:16.098069353 +0000 UTC m=+101.004954168" Apr 17 16:54:16.287124 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:16.287062 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-588fbfbcd9-ddtmz" Apr 17 16:54:16.346726 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:16.346696 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70-bound-sa-token\") pod \"5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70\" (UID: \"5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70\") " Apr 17 16:54:16.346865 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:16.346743 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdjb6\" (UniqueName: \"kubernetes.io/projected/5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70-kube-api-access-sdjb6\") pod \"5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70\" (UID: \"5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70\") " Apr 17 16:54:16.346865 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:16.346766 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70-registry-certificates\") pod \"5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70\" (UID: \"5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70\") " Apr 17 16:54:16.346865 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:16.346789 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70-trusted-ca\") pod \"5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70\" (UID: \"5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70\") " Apr 17 16:54:16.346865 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:16.346808 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70-ca-trust-extracted\") pod \"5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70\" (UID: \"5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70\") " Apr 17 16:54:16.346865 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:16.346828 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70-installation-pull-secrets\") pod \"5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70\" (UID: \"5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70\") " Apr 17 16:54:16.347094 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:16.346866 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70-image-registry-private-configuration\") pod \"5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70\" (UID: \"5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70\") " Apr 17 16:54:16.347430 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:16.347171 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70-registry-tls\") pod \"5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70\" (UID: \"5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70\") " Apr 17 16:54:16.347430 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:16.347172 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70" (UID: "5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:54:16.347430 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:16.347338 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70" (UID: "5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:54:16.347657 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:16.347572 2568 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70-registry-certificates\") on node \"ip-10-0-134-88.ec2.internal\" DevicePath \"\"" Apr 17 16:54:16.347657 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:16.347596 2568 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70-trusted-ca\") on node \"ip-10-0-134-88.ec2.internal\" DevicePath \"\"" Apr 17 16:54:16.349357 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:16.349326 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70" (UID: "5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:54:16.350349 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:16.350314 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70" (UID: "5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:54:16.350447 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:16.350375 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70" (UID: "5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:54:16.350447 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:16.350389 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70-kube-api-access-sdjb6" (OuterVolumeSpecName: "kube-api-access-sdjb6") pod "5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70" (UID: "5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70"). InnerVolumeSpecName "kube-api-access-sdjb6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:54:16.350447 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:16.350439 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70" (UID: "5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:54:16.357867 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:16.357841 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70" (UID: "5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:54:16.448558 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:16.448526 2568 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70-bound-sa-token\") on node \"ip-10-0-134-88.ec2.internal\" DevicePath \"\"" Apr 17 16:54:16.448558 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:16.448553 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sdjb6\" (UniqueName: \"kubernetes.io/projected/5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70-kube-api-access-sdjb6\") on node \"ip-10-0-134-88.ec2.internal\" DevicePath \"\"" Apr 17 16:54:16.448739 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:16.448569 2568 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70-ca-trust-extracted\") on node \"ip-10-0-134-88.ec2.internal\" DevicePath \"\"" Apr 17 16:54:16.448739 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:16.448583 2568 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70-installation-pull-secrets\") on node \"ip-10-0-134-88.ec2.internal\" DevicePath \"\"" Apr 17 16:54:16.448739 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:16.448596 2568 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70-image-registry-private-configuration\") on node \"ip-10-0-134-88.ec2.internal\" DevicePath \"\"" Apr 17 16:54:16.448739 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:16.448610 2568 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70-registry-tls\") on node \"ip-10-0-134-88.ec2.internal\" DevicePath \"\"" Apr 17 16:54:16.924946 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:16.924893 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-blsng" Apr 17 16:54:17.054613 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:17.054586 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-757c9669d9-7849l" Apr 17 16:54:17.072929 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:17.072897 2568 generic.go:358] "Generic (PLEG): container finished" podID="5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70" containerID="005ca94f7ff507962c0af1d70423cc4c2d4cf0826cebdb1701c5cc7c04f887e0" exitCode=0 Apr 17 16:54:17.073321 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:17.072964 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-588fbfbcd9-ddtmz" Apr 17 16:54:17.073321 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:17.072988 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-588fbfbcd9-ddtmz" event={"ID":"5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70","Type":"ContainerDied","Data":"005ca94f7ff507962c0af1d70423cc4c2d4cf0826cebdb1701c5cc7c04f887e0"} Apr 17 16:54:17.073321 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:17.073025 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-588fbfbcd9-ddtmz" event={"ID":"5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70","Type":"ContainerDied","Data":"fdd43cb070dff14c828be0f2e06c9d7002075497b9d80188703be49dc7b1215f"} Apr 17 16:54:17.073321 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:17.073066 2568 scope.go:117] "RemoveContainer" containerID="005ca94f7ff507962c0af1d70423cc4c2d4cf0826cebdb1701c5cc7c04f887e0" Apr 17 16:54:17.083752 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:17.083733 2568 scope.go:117] "RemoveContainer" containerID="005ca94f7ff507962c0af1d70423cc4c2d4cf0826cebdb1701c5cc7c04f887e0" Apr 17 16:54:17.084086 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:54:17.084065 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"005ca94f7ff507962c0af1d70423cc4c2d4cf0826cebdb1701c5cc7c04f887e0\": container with ID starting with 005ca94f7ff507962c0af1d70423cc4c2d4cf0826cebdb1701c5cc7c04f887e0 not found: ID does not exist" containerID="005ca94f7ff507962c0af1d70423cc4c2d4cf0826cebdb1701c5cc7c04f887e0" Apr 17 16:54:17.084174 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:17.084097 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"005ca94f7ff507962c0af1d70423cc4c2d4cf0826cebdb1701c5cc7c04f887e0"} err="failed to get container status \"005ca94f7ff507962c0af1d70423cc4c2d4cf0826cebdb1701c5cc7c04f887e0\": rpc error: code = NotFound desc = could not find container \"005ca94f7ff507962c0af1d70423cc4c2d4cf0826cebdb1701c5cc7c04f887e0\": container with ID starting with 005ca94f7ff507962c0af1d70423cc4c2d4cf0826cebdb1701c5cc7c04f887e0 not found: ID does not exist" Apr 17 16:54:17.097093 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:17.097066 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-588fbfbcd9-ddtmz"] Apr 17 16:54:17.102393 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:17.102370 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-588fbfbcd9-ddtmz"] Apr 17 16:54:17.639089 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:17.639058 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70" path="/var/lib/kubelet/pods/5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70/volumes" Apr 17 16:54:20.292311 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:20.292271 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:54:24.377790 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:24.377752 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-c655f6d99-4hfv9" Apr 17 16:54:24.378249 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:24.378072 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-c655f6d99-4hfv9" 
Apr 17 16:54:24.382317 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:24.382298 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-c655f6d99-4hfv9"
Apr 17 16:54:25.104787 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:25.104761 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-c655f6d99-4hfv9"
Apr 17 16:54:25.159423 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:25.159394 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7d7978c455-g5j8f"]
Apr 17 16:54:40.145942 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:40.145891 2568 generic.go:358] "Generic (PLEG): container finished" podID="4a8134e0-b6b3-45d5-a4b4-a9b544913d40" containerID="8f4f348b737df9afd73cf7255ab1e56a100db76cddc05577c8ab9311da7e123d" exitCode=0
Apr 17 16:54:40.146372 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:40.145954 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-lttq7" event={"ID":"4a8134e0-b6b3-45d5-a4b4-a9b544913d40","Type":"ContainerDied","Data":"8f4f348b737df9afd73cf7255ab1e56a100db76cddc05577c8ab9311da7e123d"}
Apr 17 16:54:40.146372 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:40.146345 2568 scope.go:117] "RemoveContainer" containerID="8f4f348b737df9afd73cf7255ab1e56a100db76cddc05577c8ab9311da7e123d"
Apr 17 16:54:41.150353 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:41.150321 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-lttq7" event={"ID":"4a8134e0-b6b3-45d5-a4b4-a9b544913d40","Type":"ContainerStarted","Data":"07abcc8789ceff0a38da630275c33bbfe0e5a34948891d3576aba7d90089623a"}
Apr 17 16:54:50.178570 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:50.178510 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7d7978c455-g5j8f" podUID="9dc4fa69-c69e-48f9-8c29-6b43c3163ddf" containerName="console" containerID="cri-o://8be2b9f02df019e3352f41d29675c15efbcea28fedf41250b673d9148fba4621" gracePeriod=15
Apr 17 16:54:50.182311 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:50.182282 2568 generic.go:358] "Generic (PLEG): container finished" podID="36f13da0-be40-496a-a83a-c62049f5690b" containerID="a1831fe234552656cb1e53637653d2728a40a59dacdd6817a2ab270c175c1633" exitCode=0
Apr 17 16:54:50.182412 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:50.182353 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-td7gx" event={"ID":"36f13da0-be40-496a-a83a-c62049f5690b","Type":"ContainerDied","Data":"a1831fe234552656cb1e53637653d2728a40a59dacdd6817a2ab270c175c1633"}
Apr 17 16:54:50.182681 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:50.182668 2568 scope.go:117] "RemoveContainer" containerID="a1831fe234552656cb1e53637653d2728a40a59dacdd6817a2ab270c175c1633"
Apr 17 16:54:50.423315 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:50.423294 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7d7978c455-g5j8f_9dc4fa69-c69e-48f9-8c29-6b43c3163ddf/console/0.log"
Apr 17 16:54:50.423425 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:50.423351 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7d7978c455-g5j8f"
Apr 17 16:54:50.452286 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:50.452209 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9dc4fa69-c69e-48f9-8c29-6b43c3163ddf-oauth-serving-cert\") pod \"9dc4fa69-c69e-48f9-8c29-6b43c3163ddf\" (UID: \"9dc4fa69-c69e-48f9-8c29-6b43c3163ddf\") "
Apr 17 16:54:50.452286 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:50.452263 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9dc4fa69-c69e-48f9-8c29-6b43c3163ddf-console-serving-cert\") pod \"9dc4fa69-c69e-48f9-8c29-6b43c3163ddf\" (UID: \"9dc4fa69-c69e-48f9-8c29-6b43c3163ddf\") "
Apr 17 16:54:50.452479 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:50.452304 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9dc4fa69-c69e-48f9-8c29-6b43c3163ddf-service-ca\") pod \"9dc4fa69-c69e-48f9-8c29-6b43c3163ddf\" (UID: \"9dc4fa69-c69e-48f9-8c29-6b43c3163ddf\") "
Apr 17 16:54:50.452479 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:50.452332 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhlmc\" (UniqueName: \"kubernetes.io/projected/9dc4fa69-c69e-48f9-8c29-6b43c3163ddf-kube-api-access-fhlmc\") pod \"9dc4fa69-c69e-48f9-8c29-6b43c3163ddf\" (UID: \"9dc4fa69-c69e-48f9-8c29-6b43c3163ddf\") "
Apr 17 16:54:50.452479 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:50.452376 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9dc4fa69-c69e-48f9-8c29-6b43c3163ddf-console-oauth-config\") pod \"9dc4fa69-c69e-48f9-8c29-6b43c3163ddf\" (UID: \"9dc4fa69-c69e-48f9-8c29-6b43c3163ddf\") "
Apr 17 16:54:50.452479 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:50.452425 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9dc4fa69-c69e-48f9-8c29-6b43c3163ddf-console-config\") pod \"9dc4fa69-c69e-48f9-8c29-6b43c3163ddf\" (UID: \"9dc4fa69-c69e-48f9-8c29-6b43c3163ddf\") "
Apr 17 16:54:50.452655 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:50.452635 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9dc4fa69-c69e-48f9-8c29-6b43c3163ddf-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "9dc4fa69-c69e-48f9-8c29-6b43c3163ddf" (UID: "9dc4fa69-c69e-48f9-8c29-6b43c3163ddf"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 16:54:50.452744 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:50.452722 2568 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9dc4fa69-c69e-48f9-8c29-6b43c3163ddf-oauth-serving-cert\") on node \"ip-10-0-134-88.ec2.internal\" DevicePath \"\""
Apr 17 16:54:50.452964 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:50.452760 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9dc4fa69-c69e-48f9-8c29-6b43c3163ddf-service-ca" (OuterVolumeSpecName: "service-ca") pod "9dc4fa69-c69e-48f9-8c29-6b43c3163ddf" (UID: "9dc4fa69-c69e-48f9-8c29-6b43c3163ddf"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 16:54:50.453270 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:50.453236 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9dc4fa69-c69e-48f9-8c29-6b43c3163ddf-console-config" (OuterVolumeSpecName: "console-config") pod "9dc4fa69-c69e-48f9-8c29-6b43c3163ddf" (UID: "9dc4fa69-c69e-48f9-8c29-6b43c3163ddf"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 16:54:50.455164 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:50.455136 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9dc4fa69-c69e-48f9-8c29-6b43c3163ddf-kube-api-access-fhlmc" (OuterVolumeSpecName: "kube-api-access-fhlmc") pod "9dc4fa69-c69e-48f9-8c29-6b43c3163ddf" (UID: "9dc4fa69-c69e-48f9-8c29-6b43c3163ddf"). InnerVolumeSpecName "kube-api-access-fhlmc". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 16:54:50.455252 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:50.455189 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9dc4fa69-c69e-48f9-8c29-6b43c3163ddf-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "9dc4fa69-c69e-48f9-8c29-6b43c3163ddf" (UID: "9dc4fa69-c69e-48f9-8c29-6b43c3163ddf"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 16:54:50.455310 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:50.455290 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9dc4fa69-c69e-48f9-8c29-6b43c3163ddf-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "9dc4fa69-c69e-48f9-8c29-6b43c3163ddf" (UID: "9dc4fa69-c69e-48f9-8c29-6b43c3163ddf"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 16:54:50.553540 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:50.553501 2568 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9dc4fa69-c69e-48f9-8c29-6b43c3163ddf-console-config\") on node \"ip-10-0-134-88.ec2.internal\" DevicePath \"\""
Apr 17 16:54:50.553540 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:50.553535 2568 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9dc4fa69-c69e-48f9-8c29-6b43c3163ddf-console-serving-cert\") on node \"ip-10-0-134-88.ec2.internal\" DevicePath \"\""
Apr 17 16:54:50.553540 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:50.553545 2568 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9dc4fa69-c69e-48f9-8c29-6b43c3163ddf-service-ca\") on node \"ip-10-0-134-88.ec2.internal\" DevicePath \"\""
Apr 17 16:54:50.553822 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:50.553554 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fhlmc\" (UniqueName: \"kubernetes.io/projected/9dc4fa69-c69e-48f9-8c29-6b43c3163ddf-kube-api-access-fhlmc\") on node \"ip-10-0-134-88.ec2.internal\" DevicePath \"\""
Apr 17 16:54:50.553822 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:50.553564 2568 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9dc4fa69-c69e-48f9-8c29-6b43c3163ddf-console-oauth-config\") on node \"ip-10-0-134-88.ec2.internal\" DevicePath \"\""
Apr 17 16:54:51.186996 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:51.186971 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7d7978c455-g5j8f_9dc4fa69-c69e-48f9-8c29-6b43c3163ddf/console/0.log"
Apr 17 16:54:51.187484 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:51.187008 2568 generic.go:358] "Generic (PLEG): container finished" podID="9dc4fa69-c69e-48f9-8c29-6b43c3163ddf" containerID="8be2b9f02df019e3352f41d29675c15efbcea28fedf41250b673d9148fba4621" exitCode=2
Apr 17 16:54:51.187484 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:51.187080 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7d7978c455-g5j8f"
Apr 17 16:54:51.187484 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:51.187093 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7d7978c455-g5j8f" event={"ID":"9dc4fa69-c69e-48f9-8c29-6b43c3163ddf","Type":"ContainerDied","Data":"8be2b9f02df019e3352f41d29675c15efbcea28fedf41250b673d9148fba4621"}
Apr 17 16:54:51.187484 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:51.187130 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7d7978c455-g5j8f" event={"ID":"9dc4fa69-c69e-48f9-8c29-6b43c3163ddf","Type":"ContainerDied","Data":"f3ee457743a21d50a987199676207013ad9afcc3e99f1e736d9d50bb0ba2bd59"}
Apr 17 16:54:51.187484 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:51.187150 2568 scope.go:117] "RemoveContainer" containerID="8be2b9f02df019e3352f41d29675c15efbcea28fedf41250b673d9148fba4621"
Apr 17 16:54:51.188996 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:51.188972 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-td7gx" event={"ID":"36f13da0-be40-496a-a83a-c62049f5690b","Type":"ContainerStarted","Data":"778e0f2237470e28224b1f68506cf83db33082c8e091f64e4ff74d3e892eefa2"}
Apr 17 16:54:51.196907 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:51.196890 2568 scope.go:117] "RemoveContainer" containerID="8be2b9f02df019e3352f41d29675c15efbcea28fedf41250b673d9148fba4621"
Apr 17 16:54:51.197139 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:54:51.197120 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8be2b9f02df019e3352f41d29675c15efbcea28fedf41250b673d9148fba4621\": container with ID starting with 8be2b9f02df019e3352f41d29675c15efbcea28fedf41250b673d9148fba4621 not found: ID does not exist" containerID="8be2b9f02df019e3352f41d29675c15efbcea28fedf41250b673d9148fba4621"
Apr 17 16:54:51.197208 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:51.197145 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8be2b9f02df019e3352f41d29675c15efbcea28fedf41250b673d9148fba4621"} err="failed to get container status \"8be2b9f02df019e3352f41d29675c15efbcea28fedf41250b673d9148fba4621\": rpc error: code = NotFound desc = could not find container \"8be2b9f02df019e3352f41d29675c15efbcea28fedf41250b673d9148fba4621\": container with ID starting with 8be2b9f02df019e3352f41d29675c15efbcea28fedf41250b673d9148fba4621 not found: ID does not exist"
Apr 17 16:54:51.223155 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:51.223128 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7d7978c455-g5j8f"]
Apr 17 16:54:51.229285 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:51.229259 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7d7978c455-g5j8f"]
Apr 17 16:54:51.638440 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:54:51.638407 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9dc4fa69-c69e-48f9-8c29-6b43c3163ddf" path="/var/lib/kubelet/pods/9dc4fa69-c69e-48f9-8c29-6b43c3163ddf/volumes"
Apr 17 16:55:10.292368 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:10.292339 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:55:10.311230 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:10.311208 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:55:11.267061 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:11.267030 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:55:23.290700 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:23.290670 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-c7rtc_30e3b557-37e3-4fa9-9974-a7e12eff41fb/cluster-monitoring-operator/0.log"
Apr 17 16:55:23.291125 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:23.290710 2568 generic.go:358] "Generic (PLEG): container finished" podID="30e3b557-37e3-4fa9-9974-a7e12eff41fb" containerID="e9894464e958d3b480e92aa56a4bb9608082dda0b1046c16d4064bdc8cddcf18" exitCode=2
Apr 17 16:55:23.291125 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:23.290767 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-c7rtc" event={"ID":"30e3b557-37e3-4fa9-9974-a7e12eff41fb","Type":"ContainerDied","Data":"e9894464e958d3b480e92aa56a4bb9608082dda0b1046c16d4064bdc8cddcf18"}
Apr 17 16:55:23.291125 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:23.291091 2568 scope.go:117] "RemoveContainer" containerID="e9894464e958d3b480e92aa56a4bb9608082dda0b1046c16d4064bdc8cddcf18"
Apr 17 16:55:24.295099 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:24.295072 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-c7rtc_30e3b557-37e3-4fa9-9974-a7e12eff41fb/cluster-monitoring-operator/0.log"
Apr 17 16:55:24.295471 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:24.295165 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-c7rtc" event={"ID":"30e3b557-37e3-4fa9-9974-a7e12eff41fb","Type":"ContainerStarted","Data":"dfe4a78113f2ca7c7270fe512cc87f47993a43e7cf3199791a0176dfd9d42dc2"}
Apr 17 16:55:32.070810 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:32.070771 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 17 16:55:32.071349 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:32.071255 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8" containerName="prometheus" containerID="cri-o://906ee137452794f3190404a6373110b04efcfdf789ff1835de21a5e1cb8c234d" gracePeriod=600
Apr 17 16:55:32.071349 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:32.071270 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8" containerName="kube-rbac-proxy" containerID="cri-o://beca15a9764c9d7df18e179636968d4963642cdda0ca0f5dad6566ab6f246e4b" gracePeriod=600
Apr 17 16:55:32.071349 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:32.071292 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8" containerName="thanos-sidecar" containerID="cri-o://6565f02beeefc8daa974029680f8420623db71beee8d84033a9377bbd77e5432" gracePeriod=600
Apr 17 16:55:32.071553 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:32.071352 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8" containerName="kube-rbac-proxy-web" containerID="cri-o://1eb656c40a57a84992cb8f0ebbb38ee86b307025ba55d276cc81ec3b2cfad471" gracePeriod=600
Apr 17 16:55:32.071553 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:32.071328 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8" containerName="kube-rbac-proxy-thanos" containerID="cri-o://04dfa7028adad692c05a69a067da0490a0f2fd17eec58745421c177993c0a122" gracePeriod=600
Apr 17 16:55:32.071553 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:32.071391 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8" containerName="config-reloader" containerID="cri-o://9909fa1c5f66747bf23b24b40cd1143d1801c1d1be73a0b4651202a57c33270d" gracePeriod=600
Apr 17 16:55:32.321090 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:32.321027 2568 generic.go:358] "Generic (PLEG): container finished" podID="9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8" containerID="04dfa7028adad692c05a69a067da0490a0f2fd17eec58745421c177993c0a122" exitCode=0
Apr 17 16:55:32.321090 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:32.321050 2568 generic.go:358] "Generic (PLEG): container finished" podID="9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8" containerID="beca15a9764c9d7df18e179636968d4963642cdda0ca0f5dad6566ab6f246e4b" exitCode=0
Apr 17 16:55:32.321090 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:32.321057 2568 generic.go:358] "Generic (PLEG): container finished" podID="9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8" containerID="1eb656c40a57a84992cb8f0ebbb38ee86b307025ba55d276cc81ec3b2cfad471" exitCode=0
Apr 17 16:55:32.321090 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:32.321063 2568 generic.go:358] "Generic (PLEG): container finished" podID="9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8" containerID="6565f02beeefc8daa974029680f8420623db71beee8d84033a9377bbd77e5432" exitCode=0
Apr 17 16:55:32.321090 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:32.321068 2568 generic.go:358] "Generic (PLEG): container finished" podID="9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8" containerID="9909fa1c5f66747bf23b24b40cd1143d1801c1d1be73a0b4651202a57c33270d" exitCode=0
Apr 17 16:55:32.321090 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:32.321076 2568 generic.go:358] "Generic (PLEG): container finished" podID="9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8" containerID="906ee137452794f3190404a6373110b04efcfdf789ff1835de21a5e1cb8c234d" exitCode=0
Apr 17 16:55:32.321335 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:32.321097 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8","Type":"ContainerDied","Data":"04dfa7028adad692c05a69a067da0490a0f2fd17eec58745421c177993c0a122"}
Apr 17 16:55:32.321335 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:32.321128 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8","Type":"ContainerDied","Data":"beca15a9764c9d7df18e179636968d4963642cdda0ca0f5dad6566ab6f246e4b"}
Apr 17 16:55:32.321335 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:32.321137 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8","Type":"ContainerDied","Data":"1eb656c40a57a84992cb8f0ebbb38ee86b307025ba55d276cc81ec3b2cfad471"}
Apr 17 16:55:32.321335 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:32.321147 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8","Type":"ContainerDied","Data":"6565f02beeefc8daa974029680f8420623db71beee8d84033a9377bbd77e5432"}
Apr 17 16:55:32.321335 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:32.321155 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8","Type":"ContainerDied","Data":"9909fa1c5f66747bf23b24b40cd1143d1801c1d1be73a0b4651202a57c33270d"}
Apr 17 16:55:32.321335 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:32.321164 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8","Type":"ContainerDied","Data":"906ee137452794f3190404a6373110b04efcfdf789ff1835de21a5e1cb8c234d"}
Apr 17 16:55:32.333084 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:32.333059 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:55:32.511609 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:32.511571 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-configmap-serving-certs-ca-bundle\") pod \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\" (UID: \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\") "
Apr 17 16:55:32.511793 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:32.511622 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-secret-kube-rbac-proxy\") pod \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\" (UID: \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\") "
Apr 17 16:55:32.511793 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:32.511663 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-prometheus-k8s-db\") pod \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\" (UID: \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\") "
Apr 17 16:55:32.511793 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:32.511681 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sg984\" (UniqueName: \"kubernetes.io/projected/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-kube-api-access-sg984\") pod \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\" (UID: \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\") "
Apr 17 16:55:32.511793 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:32.511696 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-web-config\") pod \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\" (UID: \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\") "
Apr 17 16:55:32.511793 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:32.511730 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-configmap-kubelet-serving-ca-bundle\") pod \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\" (UID: \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\") "
Apr 17 16:55:32.512055 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:32.511835 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-config-out\") pod \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\" (UID: \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\") "
Apr 17 16:55:32.512055 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:32.511881 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-configmap-metrics-client-ca\") pod \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\" (UID: \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\") "
Apr 17 16:55:32.512055 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:32.511946 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-config\") pod \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\" (UID: \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\") "
Apr 17 16:55:32.512055 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:32.511980 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\" (UID: \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\") "
Apr 17 16:55:32.512055 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:32.512029 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-thanos-prometheus-http-client-file\") pod \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\" (UID: \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\") "
Apr 17 16:55:32.512311 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:32.512069 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-prometheus-trusted-ca-bundle\") pod \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\" (UID: \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\") "
Apr 17 16:55:32.512311 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:32.512094 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-prometheus-k8s-rulefiles-0\") pod \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\" (UID: \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\") "
Apr 17 16:55:32.512311 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:32.512109 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8" (UID: "9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 16:55:32.512311 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:32.512128 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-secret-metrics-client-certs\") pod \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\" (UID: \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\") "
Apr 17 16:55:32.512311 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:32.512153 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-secret-prometheus-k8s-tls\") pod \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\" (UID: \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\") "
Apr 17 16:55:32.512311 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:32.512181 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\" (UID: \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\") "
Apr 17 16:55:32.512311 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:32.512211 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-secret-grpc-tls\") pod \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\" (UID: \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\") "
Apr 17 16:55:32.512311 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:32.512258 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-tls-assets\") pod \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\" (UID: \"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8\") "
Apr 17 16:55:32.512705 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:32.512347 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8" (UID: "9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 16:55:32.512705 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:32.512560 2568 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-134-88.ec2.internal\" DevicePath \"\""
Apr 17 16:55:32.512705 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:32.512584 2568 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-134-88.ec2.internal\" DevicePath \"\""
Apr 17 16:55:32.512705 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:32.512640 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8" (UID: "9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 16:55:32.512705 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:32.512688 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8" (UID: "9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 16:55:32.514580 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:32.514552 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-config-out" (OuterVolumeSpecName: "config-out") pod "9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8" (UID: "9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 16:55:32.515090 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:32.514817 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-kube-api-access-sg984" (OuterVolumeSpecName: "kube-api-access-sg984") pod "9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8" (UID: "9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8"). InnerVolumeSpecName "kube-api-access-sg984". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 16:55:32.515199 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:32.515140 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8" (UID: "9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 16:55:32.515199 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:32.515172 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-config" (OuterVolumeSpecName: "config") pod "9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8" (UID: "9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 16:55:32.515337 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:32.515195 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8" (UID: "9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 16:55:32.515337 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:32.515235 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8" (UID: "9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 16:55:32.515337 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:32.515256 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8" (UID: "9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 16:55:32.515862 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:32.515836 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8" (UID: "9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 16:55:32.516391 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:32.516338 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8" (UID: "9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 16:55:32.516719 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:32.516699 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8" (UID: "9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 16:55:32.517097 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:32.517080 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8" (UID: "9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 16:55:32.517248 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:32.517234 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8" (UID: "9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 16:55:32.517381 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:32.517369 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8" (UID: "9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 16:55:32.525780 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:32.525755 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-web-config" (OuterVolumeSpecName: "web-config") pod "9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8" (UID: "9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 16:55:32.613259 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:32.613188 2568 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-secret-kube-rbac-proxy\") on node \"ip-10-0-134-88.ec2.internal\" DevicePath \"\""
Apr 17 16:55:32.613259 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:32.613220 2568 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-prometheus-k8s-db\") on node \"ip-10-0-134-88.ec2.internal\" DevicePath \"\""
Apr 17 16:55:32.613259 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:32.613235 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sg984\" (UniqueName: \"kubernetes.io/projected/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-kube-api-access-sg984\") on node \"ip-10-0-134-88.ec2.internal\" DevicePath \"\""
Apr 17 16:55:32.613259 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:32.613251 2568 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-web-config\") on node \"ip-10-0-134-88.ec2.internal\" DevicePath \"\""
Apr 17 16:55:32.613503 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:32.613267 2568 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-config-out\") on node \"ip-10-0-134-88.ec2.internal\" DevicePath \"\""
Apr 17 16:55:32.613503 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:32.613280 2568 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-configmap-metrics-client-ca\") on node \"ip-10-0-134-88.ec2.internal\" DevicePath \"\""
Apr 17 16:55:32.613503 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:32.613292 2568 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-config\") on node \"ip-10-0-134-88.ec2.internal\" DevicePath \"\""
Apr 17 16:55:32.613503 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:32.613307 2568 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-134-88.ec2.internal\" DevicePath \"\""
Apr 17 16:55:32.613503 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:32.613323 2568 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-thanos-prometheus-http-client-file\") on node \"ip-10-0-134-88.ec2.internal\" DevicePath \"\""
Apr 17 16:55:32.613503 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:32.613338 2568 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-prometheus-trusted-ca-bundle\") on node \"ip-10-0-134-88.ec2.internal\" DevicePath \"\""
Apr 17 16:55:32.613503 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:32.613352 2568 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-134-88.ec2.internal\" DevicePath \"\""
Apr 17 16:55:32.613503 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:32.613366 2568 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-secret-metrics-client-certs\") on node \"ip-10-0-134-88.ec2.internal\" DevicePath \"\""
Apr 17 16:55:32.613503 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:32.613381 2568 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-secret-prometheus-k8s-tls\") on node \"ip-10-0-134-88.ec2.internal\" DevicePath \"\""
Apr 17 16:55:32.613503 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:32.613396 2568 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-134-88.ec2.internal\" DevicePath \"\""
Apr 17 16:55:32.613503 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:32.613411 2568 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-secret-grpc-tls\") on node \"ip-10-0-134-88.ec2.internal\" DevicePath \"\""
Apr 17 16:55:32.613503 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:32.613425 2568 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8-tls-assets\") on node \"ip-10-0-134-88.ec2.internal\" DevicePath \"\""
Apr 17 16:55:33.327431 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.327398 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8","Type":"ContainerDied","Data":"25397a07207b747d51d81da58233732ed93e4266b67d76e126adaa662fc1291f"}
Apr 17 16:55:33.327860 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.327444 2568 scope.go:117] "RemoveContainer" containerID="04dfa7028adad692c05a69a067da0490a0f2fd17eec58745421c177993c0a122"
Apr 17 16:55:33.327860 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.327476 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:55:33.335078 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.335060 2568 scope.go:117] "RemoveContainer" containerID="beca15a9764c9d7df18e179636968d4963642cdda0ca0f5dad6566ab6f246e4b"
Apr 17 16:55:33.341739 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.341722 2568 scope.go:117] "RemoveContainer" containerID="1eb656c40a57a84992cb8f0ebbb38ee86b307025ba55d276cc81ec3b2cfad471"
Apr 17 16:55:33.347837 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.347822 2568 scope.go:117] "RemoveContainer" containerID="6565f02beeefc8daa974029680f8420623db71beee8d84033a9377bbd77e5432"
Apr 17 16:55:33.352835 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.352813 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 17 16:55:33.354978 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.354961 2568 scope.go:117] "RemoveContainer" containerID="9909fa1c5f66747bf23b24b40cd1143d1801c1d1be73a0b4651202a57c33270d"
Apr 17 16:55:33.357726 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.357707 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 17 16:55:33.361644 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.361624 2568 scope.go:117] "RemoveContainer" containerID="906ee137452794f3190404a6373110b04efcfdf789ff1835de21a5e1cb8c234d"
Apr 17 16:55:33.368190 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.368173 2568 scope.go:117] "RemoveContainer" containerID="bd24ffa4a7a8e8bde9da58fef6c3249660f60e8474397f71346d28d6efe41d2f"
Apr 17 16:55:33.387999 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.387981 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 17 16:55:33.388321 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.388306 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70" containerName="registry"
Apr 17 16:55:33.388395 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.388325 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70" containerName="registry"
Apr 17 16:55:33.388395 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.388341 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8" containerName="config-reloader"
Apr 17 16:55:33.388395 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.388349 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8" containerName="config-reloader"
Apr 17 16:55:33.388395 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.388364 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8" containerName="init-config-reloader"
Apr 17 16:55:33.388395 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.388373 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8" containerName="init-config-reloader"
Apr 17 16:55:33.388395 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.388389 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8" containerName="thanos-sidecar"
Apr 17 16:55:33.388684 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.388398 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8" containerName="thanos-sidecar"
Apr 17 16:55:33.388684 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.388409 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8" containerName="prometheus"
Apr 17 16:55:33.388684 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.388419 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8" containerName="prometheus"
Apr 17 16:55:33.388684 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.388439 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8" containerName="kube-rbac-proxy"
Apr 17 16:55:33.388684 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.388447 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8" containerName="kube-rbac-proxy"
Apr 17 16:55:33.388684 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.388456 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8" containerName="kube-rbac-proxy-thanos"
Apr 17 16:55:33.388684 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.388464 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8" containerName="kube-rbac-proxy-thanos"
Apr 17 16:55:33.388684 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.388477 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9dc4fa69-c69e-48f9-8c29-6b43c3163ddf" containerName="console"
Apr 17 16:55:33.388684 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.388493 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dc4fa69-c69e-48f9-8c29-6b43c3163ddf" containerName="console"
Apr 17 16:55:33.388684 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.388509 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8" containerName="kube-rbac-proxy-web"
Apr 17 16:55:33.388684 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.388518 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8" containerName="kube-rbac-proxy-web"
Apr 17 16:55:33.388684 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.388596 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="5d5d3a6d-2d6e-4bf8-bcc4-8c44ed244a70" containerName="registry"
Apr 17 16:55:33.388684 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.388610 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8" containerName="kube-rbac-proxy-thanos"
Apr 17 16:55:33.388684 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.388621 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8" containerName="prometheus"
Apr 17 16:55:33.388684 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.388632 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8" containerName="config-reloader"
Apr 17 16:55:33.388684 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.388642 2568 memory_manager.go:356]
"RemoveStaleState removing state" podUID="9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8" containerName="thanos-sidecar" Apr 17 16:55:33.388684 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.388651 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="9dc4fa69-c69e-48f9-8c29-6b43c3163ddf" containerName="console" Apr 17 16:55:33.388684 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.388663 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8" containerName="kube-rbac-proxy-web" Apr 17 16:55:33.388684 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.388674 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8" containerName="kube-rbac-proxy" Apr 17 16:55:33.394358 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.394340 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:55:33.400125 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.400104 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 17 16:55:33.400402 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.400345 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 17 16:55:33.400726 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.400706 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 17 16:55:33.400806 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.400726 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 17 16:55:33.401292 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.401273 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 17 16:55:33.401539 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.401523 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 17 16:55:33.401748 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.401726 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-11djqkr3n0v60\"" Apr 17 16:55:33.401810 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.401730 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-tlgmm\"" Apr 17 16:55:33.401810 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.401801 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 17 16:55:33.402082 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.402067 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 17 16:55:33.402259 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.402242 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 17 16:55:33.402328 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.402306 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 17 16:55:33.402543 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.402523 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 17 16:55:33.402623 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.402606 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 17 16:55:33.409508 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.409486 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 16:55:33.412405 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.412362 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 17 16:55:33.519693 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.519656 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/728f29e0-5be4-46bb-ac36-eaa4660503ad-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"728f29e0-5be4-46bb-ac36-eaa4660503ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:55:33.519693 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.519698 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hc5jn\" (UniqueName: \"kubernetes.io/projected/728f29e0-5be4-46bb-ac36-eaa4660503ad-kube-api-access-hc5jn\") pod \"prometheus-k8s-0\" (UID: \"728f29e0-5be4-46bb-ac36-eaa4660503ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:55:33.520022 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.519728 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/728f29e0-5be4-46bb-ac36-eaa4660503ad-config\") pod \"prometheus-k8s-0\" (UID: \"728f29e0-5be4-46bb-ac36-eaa4660503ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:55:33.520022 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.519776 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/728f29e0-5be4-46bb-ac36-eaa4660503ad-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"728f29e0-5be4-46bb-ac36-eaa4660503ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:55:33.520022 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.519802 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/728f29e0-5be4-46bb-ac36-eaa4660503ad-web-config\") pod \"prometheus-k8s-0\" (UID: \"728f29e0-5be4-46bb-ac36-eaa4660503ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:55:33.520022 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.519844 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/728f29e0-5be4-46bb-ac36-eaa4660503ad-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"728f29e0-5be4-46bb-ac36-eaa4660503ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:55:33.520022 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.519875 2568 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/728f29e0-5be4-46bb-ac36-eaa4660503ad-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"728f29e0-5be4-46bb-ac36-eaa4660503ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:55:33.520022 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.519899 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/728f29e0-5be4-46bb-ac36-eaa4660503ad-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"728f29e0-5be4-46bb-ac36-eaa4660503ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:55:33.520022 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.519945 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/728f29e0-5be4-46bb-ac36-eaa4660503ad-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"728f29e0-5be4-46bb-ac36-eaa4660503ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:55:33.520022 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.519989 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/728f29e0-5be4-46bb-ac36-eaa4660503ad-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"728f29e0-5be4-46bb-ac36-eaa4660503ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:55:33.520022 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.520015 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/728f29e0-5be4-46bb-ac36-eaa4660503ad-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"728f29e0-5be4-46bb-ac36-eaa4660503ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:55:33.520402 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.520038 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/728f29e0-5be4-46bb-ac36-eaa4660503ad-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"728f29e0-5be4-46bb-ac36-eaa4660503ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:55:33.520402 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.520082 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/728f29e0-5be4-46bb-ac36-eaa4660503ad-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"728f29e0-5be4-46bb-ac36-eaa4660503ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:55:33.520402 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.520121 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/728f29e0-5be4-46bb-ac36-eaa4660503ad-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"728f29e0-5be4-46bb-ac36-eaa4660503ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:55:33.520402 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.520145 2568 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/728f29e0-5be4-46bb-ac36-eaa4660503ad-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"728f29e0-5be4-46bb-ac36-eaa4660503ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:55:33.520402 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.520174 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/728f29e0-5be4-46bb-ac36-eaa4660503ad-config-out\") pod \"prometheus-k8s-0\" (UID: \"728f29e0-5be4-46bb-ac36-eaa4660503ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:55:33.520402 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.520197 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/728f29e0-5be4-46bb-ac36-eaa4660503ad-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"728f29e0-5be4-46bb-ac36-eaa4660503ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:55:33.520402 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.520256 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/728f29e0-5be4-46bb-ac36-eaa4660503ad-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"728f29e0-5be4-46bb-ac36-eaa4660503ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:55:33.621551 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.621472 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/728f29e0-5be4-46bb-ac36-eaa4660503ad-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"728f29e0-5be4-46bb-ac36-eaa4660503ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:55:33.621551 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.621507 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/728f29e0-5be4-46bb-ac36-eaa4660503ad-web-config\") pod \"prometheus-k8s-0\" (UID: \"728f29e0-5be4-46bb-ac36-eaa4660503ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:55:33.621551 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.621527 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/728f29e0-5be4-46bb-ac36-eaa4660503ad-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"728f29e0-5be4-46bb-ac36-eaa4660503ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:55:33.621551 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.621548 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/728f29e0-5be4-46bb-ac36-eaa4660503ad-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"728f29e0-5be4-46bb-ac36-eaa4660503ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:55:33.621853 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.621564 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/728f29e0-5be4-46bb-ac36-eaa4660503ad-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: 
\"728f29e0-5be4-46bb-ac36-eaa4660503ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:55:33.621853 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.621699 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/728f29e0-5be4-46bb-ac36-eaa4660503ad-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"728f29e0-5be4-46bb-ac36-eaa4660503ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:55:33.621853 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.621769 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/728f29e0-5be4-46bb-ac36-eaa4660503ad-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"728f29e0-5be4-46bb-ac36-eaa4660503ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:55:33.621853 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.621802 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/728f29e0-5be4-46bb-ac36-eaa4660503ad-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"728f29e0-5be4-46bb-ac36-eaa4660503ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:55:33.621853 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.621829 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/728f29e0-5be4-46bb-ac36-eaa4660503ad-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"728f29e0-5be4-46bb-ac36-eaa4660503ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:55:33.622119 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.621878 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/728f29e0-5be4-46bb-ac36-eaa4660503ad-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"728f29e0-5be4-46bb-ac36-eaa4660503ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:55:33.622119 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.621949 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/728f29e0-5be4-46bb-ac36-eaa4660503ad-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"728f29e0-5be4-46bb-ac36-eaa4660503ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:55:33.622119 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.621979 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/728f29e0-5be4-46bb-ac36-eaa4660503ad-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"728f29e0-5be4-46bb-ac36-eaa4660503ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:55:33.622119 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.622011 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/728f29e0-5be4-46bb-ac36-eaa4660503ad-config-out\") pod \"prometheus-k8s-0\" (UID: \"728f29e0-5be4-46bb-ac36-eaa4660503ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:55:33.622119 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.622040 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/728f29e0-5be4-46bb-ac36-eaa4660503ad-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"728f29e0-5be4-46bb-ac36-eaa4660503ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:55:33.622119 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.622074 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/728f29e0-5be4-46bb-ac36-eaa4660503ad-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"728f29e0-5be4-46bb-ac36-eaa4660503ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:55:33.622119 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.622113 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/728f29e0-5be4-46bb-ac36-eaa4660503ad-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"728f29e0-5be4-46bb-ac36-eaa4660503ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:55:33.622448 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.622142 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hc5jn\" (UniqueName: \"kubernetes.io/projected/728f29e0-5be4-46bb-ac36-eaa4660503ad-kube-api-access-hc5jn\") pod \"prometheus-k8s-0\" (UID: \"728f29e0-5be4-46bb-ac36-eaa4660503ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:55:33.622448 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.622179 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/728f29e0-5be4-46bb-ac36-eaa4660503ad-config\") pod \"prometheus-k8s-0\" (UID: \"728f29e0-5be4-46bb-ac36-eaa4660503ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:55:33.622448 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.622336 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/728f29e0-5be4-46bb-ac36-eaa4660503ad-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"728f29e0-5be4-46bb-ac36-eaa4660503ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:55:33.622701 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.622673 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/728f29e0-5be4-46bb-ac36-eaa4660503ad-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"728f29e0-5be4-46bb-ac36-eaa4660503ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:55:33.624987 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.624736 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/728f29e0-5be4-46bb-ac36-eaa4660503ad-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"728f29e0-5be4-46bb-ac36-eaa4660503ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:55:33.624987 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.624772 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/728f29e0-5be4-46bb-ac36-eaa4660503ad-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"728f29e0-5be4-46bb-ac36-eaa4660503ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 
16:55:33.624987 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.624873 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/728f29e0-5be4-46bb-ac36-eaa4660503ad-web-config\") pod \"prometheus-k8s-0\" (UID: \"728f29e0-5be4-46bb-ac36-eaa4660503ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:55:33.624987 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.624873 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/728f29e0-5be4-46bb-ac36-eaa4660503ad-config\") pod \"prometheus-k8s-0\" (UID: \"728f29e0-5be4-46bb-ac36-eaa4660503ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:55:33.625260 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.625054 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/728f29e0-5be4-46bb-ac36-eaa4660503ad-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"728f29e0-5be4-46bb-ac36-eaa4660503ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:55:33.625260 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.625149 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/728f29e0-5be4-46bb-ac36-eaa4660503ad-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"728f29e0-5be4-46bb-ac36-eaa4660503ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:55:33.625812 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.625683 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/728f29e0-5be4-46bb-ac36-eaa4660503ad-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"728f29e0-5be4-46bb-ac36-eaa4660503ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:55:33.625812 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.625750 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/728f29e0-5be4-46bb-ac36-eaa4660503ad-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"728f29e0-5be4-46bb-ac36-eaa4660503ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:55:33.626002 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.625878 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/728f29e0-5be4-46bb-ac36-eaa4660503ad-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"728f29e0-5be4-46bb-ac36-eaa4660503ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:55:33.626298 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.626278 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/728f29e0-5be4-46bb-ac36-eaa4660503ad-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"728f29e0-5be4-46bb-ac36-eaa4660503ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:55:33.627112 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.627085 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/728f29e0-5be4-46bb-ac36-eaa4660503ad-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"728f29e0-5be4-46bb-ac36-eaa4660503ad\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:55:33.627321 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.627302 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/728f29e0-5be4-46bb-ac36-eaa4660503ad-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"728f29e0-5be4-46bb-ac36-eaa4660503ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:55:33.627382 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.627317 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/728f29e0-5be4-46bb-ac36-eaa4660503ad-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"728f29e0-5be4-46bb-ac36-eaa4660503ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:55:33.627640 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.627622 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/728f29e0-5be4-46bb-ac36-eaa4660503ad-config-out\") pod \"prometheus-k8s-0\" (UID: \"728f29e0-5be4-46bb-ac36-eaa4660503ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:55:33.627692 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.627646 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/728f29e0-5be4-46bb-ac36-eaa4660503ad-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"728f29e0-5be4-46bb-ac36-eaa4660503ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:55:33.634670 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.634649 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hc5jn\" (UniqueName: \"kubernetes.io/projected/728f29e0-5be4-46bb-ac36-eaa4660503ad-kube-api-access-hc5jn\") pod \"prometheus-k8s-0\" (UID: \"728f29e0-5be4-46bb-ac36-eaa4660503ad\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:55:33.639202 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.639175 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8" path="/var/lib/kubelet/pods/9e6ab01b-20d1-430f-8cfa-98f6ffe0f7d8/volumes" Apr 17 16:55:33.705406 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.705384 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:55:33.832221 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:33.832195 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 16:55:33.834041 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:55:33.834015 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod728f29e0_5be4_46bb_ac36_eaa4660503ad.slice/crio-b39fc1d6b289580759f34cf05af7c0b694be102daba98effc5a6fec1322ef0de WatchSource:0}: Error finding container b39fc1d6b289580759f34cf05af7c0b694be102daba98effc5a6fec1322ef0de: Status 404 returned error can't find the container with id b39fc1d6b289580759f34cf05af7c0b694be102daba98effc5a6fec1322ef0de Apr 17 16:55:34.332548 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:34.332513 2568 generic.go:358] "Generic (PLEG): container finished" podID="728f29e0-5be4-46bb-ac36-eaa4660503ad" containerID="fad735c9abdf22ee8db3e7a03a2585d7320d857749a0f9a5b1737d7c2b2d3f08" exitCode=0 Apr 17 16:55:34.332964 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:34.332556 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"728f29e0-5be4-46bb-ac36-eaa4660503ad","Type":"ContainerDied","Data":"fad735c9abdf22ee8db3e7a03a2585d7320d857749a0f9a5b1737d7c2b2d3f08"} Apr 17 16:55:34.332964 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:34.332577 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"728f29e0-5be4-46bb-ac36-eaa4660503ad","Type":"ContainerStarted","Data":"b39fc1d6b289580759f34cf05af7c0b694be102daba98effc5a6fec1322ef0de"} Apr 17 16:55:35.338803 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:35.338775 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"728f29e0-5be4-46bb-ac36-eaa4660503ad","Type":"ContainerStarted","Data":"5c1a353371531f298959ea501ccfa7fe136d1cf79b78e10fbc7822d76ecccbc2"} Apr 17 16:55:35.339162 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:35.338811 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"728f29e0-5be4-46bb-ac36-eaa4660503ad","Type":"ContainerStarted","Data":"94c8770bbc57bda4673585432c8de1b11cc85d63d0981105bc3eaa9162bf14e2"} Apr 17 16:55:35.339162 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:35.338823 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"728f29e0-5be4-46bb-ac36-eaa4660503ad","Type":"ContainerStarted","Data":"23d5ba852cfd4ba1fa8b6fa04a5f02a3ca5ffba0f8681b9943dbcde677dccec9"} Apr 17 16:55:35.339162 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:35.338832 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"728f29e0-5be4-46bb-ac36-eaa4660503ad","Type":"ContainerStarted","Data":"777082f1dbbe6b0d29af730a724361190b947db6a5f2c947104f479b0ac01e64"} Apr 17 16:55:35.339162 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:35.338840 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"728f29e0-5be4-46bb-ac36-eaa4660503ad","Type":"ContainerStarted","Data":"093303400ba3dfa404a639ddd5b7cadcd2755e60febe0358ff12069cd684802a"} Apr 17 16:55:35.339162 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:35.338849 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"728f29e0-5be4-46bb-ac36-eaa4660503ad","Type":"ContainerStarted","Data":"8b1b972120a95ad156e61fec38a65f2eb4c81e9db566c137d6562b16c1dda380"} Apr 17 16:55:35.368197 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:35.368145 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.368126032 podStartE2EDuration="2.368126032s" podCreationTimestamp="2026-04-17 16:55:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:55:35.36583686 +0000 UTC m=+180.272721688" watchObservedRunningTime="2026-04-17 16:55:35.368126032 +0000 UTC m=+180.275010847" Apr 17 16:55:38.706017 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:38.705973 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:55:47.044176 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:55:47.044141 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-c655f6d99-4hfv9"] Apr 17 16:56:12.063816 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:56:12.063759 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-c655f6d99-4hfv9" podUID="97a44237-c154-4a0b-b8ca-c4e5d84651c4" containerName="console" containerID="cri-o://eec91463a9b2f354a8f8435baf26688ec74268fdf7ae858db2e9f465010517e9" gracePeriod=15 Apr 17 16:56:12.299459 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:56:12.299437 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-c655f6d99-4hfv9_97a44237-c154-4a0b-b8ca-c4e5d84651c4/console/0.log" Apr 17 16:56:12.299578 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:56:12.299501 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-c655f6d99-4hfv9" Apr 17 16:56:12.444342 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:56:12.444261 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-c655f6d99-4hfv9_97a44237-c154-4a0b-b8ca-c4e5d84651c4/console/0.log" Apr 17 16:56:12.444342 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:56:12.444305 2568 generic.go:358] "Generic (PLEG): container finished" podID="97a44237-c154-4a0b-b8ca-c4e5d84651c4" containerID="eec91463a9b2f354a8f8435baf26688ec74268fdf7ae858db2e9f465010517e9" exitCode=2 Apr 17 16:56:12.444530 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:56:12.444337 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c655f6d99-4hfv9" event={"ID":"97a44237-c154-4a0b-b8ca-c4e5d84651c4","Type":"ContainerDied","Data":"eec91463a9b2f354a8f8435baf26688ec74268fdf7ae858db2e9f465010517e9"} Apr 17 16:56:12.444530 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:56:12.444375 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c655f6d99-4hfv9" event={"ID":"97a44237-c154-4a0b-b8ca-c4e5d84651c4","Type":"ContainerDied","Data":"d6f5a6eb2a4c1e365b91cee437f936707e06c100fd64fe382db2538dd57dc360"} Apr 17 16:56:12.444530 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:56:12.444378 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-c655f6d99-4hfv9" Apr 17 16:56:12.444530 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:56:12.444395 2568 scope.go:117] "RemoveContainer" containerID="eec91463a9b2f354a8f8435baf26688ec74268fdf7ae858db2e9f465010517e9" Apr 17 16:56:12.447571 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:56:12.447554 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/97a44237-c154-4a0b-b8ca-c4e5d84651c4-console-oauth-config\") pod \"97a44237-c154-4a0b-b8ca-c4e5d84651c4\" (UID: \"97a44237-c154-4a0b-b8ca-c4e5d84651c4\") " Apr 17 16:56:12.447671 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:56:12.447611 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/97a44237-c154-4a0b-b8ca-c4e5d84651c4-service-ca\") pod \"97a44237-c154-4a0b-b8ca-c4e5d84651c4\" (UID: \"97a44237-c154-4a0b-b8ca-c4e5d84651c4\") " Apr 17 16:56:12.447671 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:56:12.447644 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-927tr\" (UniqueName: \"kubernetes.io/projected/97a44237-c154-4a0b-b8ca-c4e5d84651c4-kube-api-access-927tr\") pod \"97a44237-c154-4a0b-b8ca-c4e5d84651c4\" (UID: \"97a44237-c154-4a0b-b8ca-c4e5d84651c4\") " Apr 17 16:56:12.447671 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:56:12.447664 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/97a44237-c154-4a0b-b8ca-c4e5d84651c4-oauth-serving-cert\") pod \"97a44237-c154-4a0b-b8ca-c4e5d84651c4\" (UID: \"97a44237-c154-4a0b-b8ca-c4e5d84651c4\") " Apr 17 16:56:12.447843 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:56:12.447684 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/97a44237-c154-4a0b-b8ca-c4e5d84651c4-trusted-ca-bundle\") pod \"97a44237-c154-4a0b-b8ca-c4e5d84651c4\" (UID: \"97a44237-c154-4a0b-b8ca-c4e5d84651c4\") " Apr 17 16:56:12.447843 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:56:12.447716 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/97a44237-c154-4a0b-b8ca-c4e5d84651c4-console-serving-cert\") pod \"97a44237-c154-4a0b-b8ca-c4e5d84651c4\" (UID: \"97a44237-c154-4a0b-b8ca-c4e5d84651c4\") " Apr 17 16:56:12.447843 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:56:12.447739 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/97a44237-c154-4a0b-b8ca-c4e5d84651c4-console-config\") pod \"97a44237-c154-4a0b-b8ca-c4e5d84651c4\" (UID: \"97a44237-c154-4a0b-b8ca-c4e5d84651c4\") " Apr 17 16:56:12.448114 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:56:12.448087 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97a44237-c154-4a0b-b8ca-c4e5d84651c4-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "97a44237-c154-4a0b-b8ca-c4e5d84651c4" (UID: "97a44237-c154-4a0b-b8ca-c4e5d84651c4"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:56:12.448361 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:56:12.448197 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97a44237-c154-4a0b-b8ca-c4e5d84651c4-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "97a44237-c154-4a0b-b8ca-c4e5d84651c4" (UID: "97a44237-c154-4a0b-b8ca-c4e5d84651c4"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:56:12.448361 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:56:12.448244 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97a44237-c154-4a0b-b8ca-c4e5d84651c4-console-config" (OuterVolumeSpecName: "console-config") pod "97a44237-c154-4a0b-b8ca-c4e5d84651c4" (UID: "97a44237-c154-4a0b-b8ca-c4e5d84651c4"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:56:12.448537 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:56:12.448487 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97a44237-c154-4a0b-b8ca-c4e5d84651c4-service-ca" (OuterVolumeSpecName: "service-ca") pod "97a44237-c154-4a0b-b8ca-c4e5d84651c4" (UID: "97a44237-c154-4a0b-b8ca-c4e5d84651c4"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:56:12.450052 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:56:12.450028 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97a44237-c154-4a0b-b8ca-c4e5d84651c4-kube-api-access-927tr" (OuterVolumeSpecName: "kube-api-access-927tr") pod "97a44237-c154-4a0b-b8ca-c4e5d84651c4" (UID: "97a44237-c154-4a0b-b8ca-c4e5d84651c4"). InnerVolumeSpecName "kube-api-access-927tr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:56:12.450052 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:56:12.450035 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97a44237-c154-4a0b-b8ca-c4e5d84651c4-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "97a44237-c154-4a0b-b8ca-c4e5d84651c4" (UID: "97a44237-c154-4a0b-b8ca-c4e5d84651c4"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:56:12.450178 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:56:12.450056 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97a44237-c154-4a0b-b8ca-c4e5d84651c4-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "97a44237-c154-4a0b-b8ca-c4e5d84651c4" (UID: "97a44237-c154-4a0b-b8ca-c4e5d84651c4"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:56:12.453119 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:56:12.453104 2568 scope.go:117] "RemoveContainer" containerID="eec91463a9b2f354a8f8435baf26688ec74268fdf7ae858db2e9f465010517e9" Apr 17 16:56:12.453404 ip-10-0-134-88 kubenswrapper[2568]: E0417 16:56:12.453383 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eec91463a9b2f354a8f8435baf26688ec74268fdf7ae858db2e9f465010517e9\": container with ID starting with eec91463a9b2f354a8f8435baf26688ec74268fdf7ae858db2e9f465010517e9 not found: ID does not exist" containerID="eec91463a9b2f354a8f8435baf26688ec74268fdf7ae858db2e9f465010517e9" Apr 17 16:56:12.453445 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:56:12.453413 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eec91463a9b2f354a8f8435baf26688ec74268fdf7ae858db2e9f465010517e9"} err="failed to get container status \"eec91463a9b2f354a8f8435baf26688ec74268fdf7ae858db2e9f465010517e9\": rpc error: code = NotFound desc = could not find container \"eec91463a9b2f354a8f8435baf26688ec74268fdf7ae858db2e9f465010517e9\": container with ID starting with eec91463a9b2f354a8f8435baf26688ec74268fdf7ae858db2e9f465010517e9 not found: ID does not exist" Apr 17 16:56:12.549212 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:56:12.549177 2568 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/97a44237-c154-4a0b-b8ca-c4e5d84651c4-console-oauth-config\") on node \"ip-10-0-134-88.ec2.internal\" DevicePath \"\"" Apr 17 16:56:12.549212 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:56:12.549207 2568 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/97a44237-c154-4a0b-b8ca-c4e5d84651c4-service-ca\") on node \"ip-10-0-134-88.ec2.internal\" DevicePath \"\"" Apr 17 16:56:12.549212 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:56:12.549218 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-927tr\" (UniqueName: \"kubernetes.io/projected/97a44237-c154-4a0b-b8ca-c4e5d84651c4-kube-api-access-927tr\") on node \"ip-10-0-134-88.ec2.internal\" DevicePath \"\"" Apr 17 16:56:12.549439 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:56:12.549227 2568 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/97a44237-c154-4a0b-b8ca-c4e5d84651c4-oauth-serving-cert\") on node \"ip-10-0-134-88.ec2.internal\" DevicePath \"\"" Apr 17 16:56:12.549439 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:56:12.549263 2568 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/97a44237-c154-4a0b-b8ca-c4e5d84651c4-trusted-ca-bundle\") on node \"ip-10-0-134-88.ec2.internal\" DevicePath \"\"" Apr 17 16:56:12.549439 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:56:12.549276 2568 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/97a44237-c154-4a0b-b8ca-c4e5d84651c4-console-serving-cert\") on node \"ip-10-0-134-88.ec2.internal\" DevicePath \"\"" Apr 17 16:56:12.549439 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:56:12.549286 2568 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/97a44237-c154-4a0b-b8ca-c4e5d84651c4-console-config\") on node \"ip-10-0-134-88.ec2.internal\" DevicePath \"\"" 
Apr 17 16:56:12.766691 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:56:12.766657 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-c655f6d99-4hfv9"] Apr 17 16:56:12.770743 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:56:12.770719 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-c655f6d99-4hfv9"] Apr 17 16:56:13.638677 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:56:13.638641 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97a44237-c154-4a0b-b8ca-c4e5d84651c4" path="/var/lib/kubelet/pods/97a44237-c154-4a0b-b8ca-c4e5d84651c4/volumes" Apr 17 16:56:33.705789 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:56:33.705743 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:56:33.721159 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:56:33.721129 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:56:34.529647 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:56:34.529621 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:57:35.519482 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:57:35.519449 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-c7rtc_30e3b557-37e3-4fa9-9974-a7e12eff41fb/cluster-monitoring-operator/0.log" Apr 17 16:57:35.520086 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:57:35.519659 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-c7rtc_30e3b557-37e3-4fa9-9974-a7e12eff41fb/cluster-monitoring-operator/0.log" Apr 17 16:57:35.527316 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:57:35.527297 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q5p9p_f461b687-3271-484f-a873-6a5fb0b1214d/ovn-acl-logging/0.log" Apr 17 16:57:35.527439 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:57:35.527353 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q5p9p_f461b687-3271-484f-a873-6a5fb0b1214d/ovn-acl-logging/0.log" Apr 17 16:57:35.533106 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:57:35.533089 2568 kubelet.go:1628] "Image garbage collection succeeded" Apr 17 16:58:14.378162 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:58:14.378130 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-cdck2"] Apr 17 16:58:14.378573 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:58:14.378440 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="97a44237-c154-4a0b-b8ca-c4e5d84651c4" containerName="console" Apr 17 16:58:14.378573 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:58:14.378451 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="97a44237-c154-4a0b-b8ca-c4e5d84651c4" containerName="console" Apr 17 16:58:14.378573 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:58:14.378513 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="97a44237-c154-4a0b-b8ca-c4e5d84651c4" containerName="console" Apr 17 16:58:14.381595 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:58:14.381576 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-cdck2" Apr 17 16:58:14.384189 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:58:14.384166 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 17 16:58:14.384277 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:58:14.384262 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 17 16:58:14.385460 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:58:14.385443 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-b6kc6\"" Apr 17 16:58:14.388829 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:58:14.388489 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-cdck2"] Apr 17 16:58:14.510009 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:58:14.509968 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c3523dd6-17a8-47f8-b7dd-b6ce28c642a6-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-cdck2\" (UID: \"c3523dd6-17a8-47f8-b7dd-b6ce28c642a6\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-cdck2" Apr 17 16:58:14.510181 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:58:14.510054 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7z8f\" (UniqueName: \"kubernetes.io/projected/c3523dd6-17a8-47f8-b7dd-b6ce28c642a6-kube-api-access-z7z8f\") pod \"cert-manager-cainjector-8966b78d4-cdck2\" (UID: \"c3523dd6-17a8-47f8-b7dd-b6ce28c642a6\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-cdck2" Apr 17 16:58:14.611255 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:58:14.611219 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c3523dd6-17a8-47f8-b7dd-b6ce28c642a6-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-cdck2\" (UID: \"c3523dd6-17a8-47f8-b7dd-b6ce28c642a6\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-cdck2" Apr 17 16:58:14.611426 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:58:14.611303 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z7z8f\" (UniqueName: \"kubernetes.io/projected/c3523dd6-17a8-47f8-b7dd-b6ce28c642a6-kube-api-access-z7z8f\") pod \"cert-manager-cainjector-8966b78d4-cdck2\" (UID: \"c3523dd6-17a8-47f8-b7dd-b6ce28c642a6\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-cdck2" Apr 17 16:58:14.619413 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:58:14.619384 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c3523dd6-17a8-47f8-b7dd-b6ce28c642a6-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-cdck2\" (UID: \"c3523dd6-17a8-47f8-b7dd-b6ce28c642a6\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-cdck2" Apr 17 16:58:14.619511 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:58:14.619412 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7z8f\" (UniqueName: \"kubernetes.io/projected/c3523dd6-17a8-47f8-b7dd-b6ce28c642a6-kube-api-access-z7z8f\") pod \"cert-manager-cainjector-8966b78d4-cdck2\" (UID: \"c3523dd6-17a8-47f8-b7dd-b6ce28c642a6\") " 
pod="cert-manager/cert-manager-cainjector-8966b78d4-cdck2" Apr 17 16:58:14.705700 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:58:14.705612 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-cdck2" Apr 17 16:58:14.824858 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:58:14.824802 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-cdck2"] Apr 17 16:58:14.827162 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:58:14.827139 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc3523dd6_17a8_47f8_b7dd_b6ce28c642a6.slice/crio-33a6e592b3b1ffbf11de62443be73affe90cec9231e577de396bf795baf5799b WatchSource:0}: Error finding container 33a6e592b3b1ffbf11de62443be73affe90cec9231e577de396bf795baf5799b: Status 404 returned error can't find the container with id 33a6e592b3b1ffbf11de62443be73affe90cec9231e577de396bf795baf5799b Apr 17 16:58:14.829292 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:58:14.829277 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 16:58:15.825872 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:58:15.825836 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-cdck2" event={"ID":"c3523dd6-17a8-47f8-b7dd-b6ce28c642a6","Type":"ContainerStarted","Data":"33a6e592b3b1ffbf11de62443be73affe90cec9231e577de396bf795baf5799b"} Apr 17 16:58:18.837571 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:58:18.837530 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-cdck2" event={"ID":"c3523dd6-17a8-47f8-b7dd-b6ce28c642a6","Type":"ContainerStarted","Data":"e8cebbb6f5764938e03a2d3d73cbd587dde76e3ef8cdd7e2a6c1eaee05a40f93"} Apr 17 16:58:18.853639 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:58:18.853582 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-8966b78d4-cdck2" podStartSLOduration=1.380244685 podStartE2EDuration="4.85356472s" podCreationTimestamp="2026-04-17 16:58:14 +0000 UTC" firstStartedPulling="2026-04-17 16:58:14.82940397 +0000 UTC m=+339.736288764" lastFinishedPulling="2026-04-17 16:58:18.302724005 +0000 UTC m=+343.209608799" observedRunningTime="2026-04-17 16:58:18.852583647 +0000 UTC m=+343.759468463" watchObservedRunningTime="2026-04-17 16:58:18.85356472 +0000 UTC m=+343.760449537" Apr 17 16:58:46.617668 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:58:46.617584 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-58fcc7cb5-hr5x9"] Apr 17 16:58:46.628508 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:58:46.628468 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-58fcc7cb5-hr5x9" Apr 17 16:58:46.629294 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:58:46.629257 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-58fcc7cb5-hr5x9"] Apr 17 16:58:46.632661 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:58:46.632636 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 17 16:58:46.632661 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:58:46.632648 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 17 16:58:46.632848 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:58:46.632663 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 17 16:58:46.632848 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:58:46.632663 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 17 16:58:46.632848 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:58:46.632740 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-lb5t5\"" Apr 17 16:58:46.632848 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:58:46.632647 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 17 16:58:46.671132 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:58:46.671101 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmzmb\" (UniqueName: \"kubernetes.io/projected/c8b63800-1697-4644-bece-e40e2ff7d4e1-kube-api-access-cmzmb\") pod \"lws-controller-manager-58fcc7cb5-hr5x9\" (UID: \"c8b63800-1697-4644-bece-e40e2ff7d4e1\") " pod="openshift-lws-operator/lws-controller-manager-58fcc7cb5-hr5x9" Apr 17 16:58:46.671250 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:58:46.671141 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/c8b63800-1697-4644-bece-e40e2ff7d4e1-metrics-cert\") pod \"lws-controller-manager-58fcc7cb5-hr5x9\" (UID: \"c8b63800-1697-4644-bece-e40e2ff7d4e1\") " pod="openshift-lws-operator/lws-controller-manager-58fcc7cb5-hr5x9" Apr 17 16:58:46.671250 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:58:46.671185 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c8b63800-1697-4644-bece-e40e2ff7d4e1-cert\") pod \"lws-controller-manager-58fcc7cb5-hr5x9\" (UID: \"c8b63800-1697-4644-bece-e40e2ff7d4e1\") " pod="openshift-lws-operator/lws-controller-manager-58fcc7cb5-hr5x9" Apr 17 16:58:46.671341 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:58:46.671273 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/c8b63800-1697-4644-bece-e40e2ff7d4e1-manager-config\") pod \"lws-controller-manager-58fcc7cb5-hr5x9\" (UID: \"c8b63800-1697-4644-bece-e40e2ff7d4e1\") " pod="openshift-lws-operator/lws-controller-manager-58fcc7cb5-hr5x9" Apr 17 16:58:46.772382 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:58:46.772344 2568 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c8b63800-1697-4644-bece-e40e2ff7d4e1-cert\") pod \"lws-controller-manager-58fcc7cb5-hr5x9\" (UID: \"c8b63800-1697-4644-bece-e40e2ff7d4e1\") " pod="openshift-lws-operator/lws-controller-manager-58fcc7cb5-hr5x9" Apr 17 16:58:46.772565 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:58:46.772405 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/c8b63800-1697-4644-bece-e40e2ff7d4e1-manager-config\") pod \"lws-controller-manager-58fcc7cb5-hr5x9\" (UID: \"c8b63800-1697-4644-bece-e40e2ff7d4e1\") " pod="openshift-lws-operator/lws-controller-manager-58fcc7cb5-hr5x9" Apr 17 16:58:46.772565 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:58:46.772449 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cmzmb\" (UniqueName: \"kubernetes.io/projected/c8b63800-1697-4644-bece-e40e2ff7d4e1-kube-api-access-cmzmb\") pod \"lws-controller-manager-58fcc7cb5-hr5x9\" (UID: \"c8b63800-1697-4644-bece-e40e2ff7d4e1\") " pod="openshift-lws-operator/lws-controller-manager-58fcc7cb5-hr5x9" Apr 17 16:58:46.772565 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:58:46.772474 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/c8b63800-1697-4644-bece-e40e2ff7d4e1-metrics-cert\") pod \"lws-controller-manager-58fcc7cb5-hr5x9\" (UID: \"c8b63800-1697-4644-bece-e40e2ff7d4e1\") " pod="openshift-lws-operator/lws-controller-manager-58fcc7cb5-hr5x9" Apr 17 16:58:46.773100 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:58:46.773075 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/c8b63800-1697-4644-bece-e40e2ff7d4e1-manager-config\") pod \"lws-controller-manager-58fcc7cb5-hr5x9\" (UID: \"c8b63800-1697-4644-bece-e40e2ff7d4e1\") " pod="openshift-lws-operator/lws-controller-manager-58fcc7cb5-hr5x9" Apr 17 16:58:46.774895 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:58:46.774873 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/c8b63800-1697-4644-bece-e40e2ff7d4e1-metrics-cert\") pod \"lws-controller-manager-58fcc7cb5-hr5x9\" (UID: \"c8b63800-1697-4644-bece-e40e2ff7d4e1\") " pod="openshift-lws-operator/lws-controller-manager-58fcc7cb5-hr5x9" Apr 17 16:58:46.774999 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:58:46.774941 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c8b63800-1697-4644-bece-e40e2ff7d4e1-cert\") pod \"lws-controller-manager-58fcc7cb5-hr5x9\" (UID: \"c8b63800-1697-4644-bece-e40e2ff7d4e1\") " pod="openshift-lws-operator/lws-controller-manager-58fcc7cb5-hr5x9" Apr 17 16:58:46.782368 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:58:46.782344 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmzmb\" (UniqueName: \"kubernetes.io/projected/c8b63800-1697-4644-bece-e40e2ff7d4e1-kube-api-access-cmzmb\") pod \"lws-controller-manager-58fcc7cb5-hr5x9\" (UID: \"c8b63800-1697-4644-bece-e40e2ff7d4e1\") " pod="openshift-lws-operator/lws-controller-manager-58fcc7cb5-hr5x9" Apr 17 16:58:46.938602 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:58:46.938502 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-58fcc7cb5-hr5x9" Apr 17 16:58:47.068910 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:58:47.068887 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-58fcc7cb5-hr5x9"] Apr 17 16:58:47.071424 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:58:47.071398 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8b63800_1697_4644_bece_e40e2ff7d4e1.slice/crio-e0e009e3532691547228622714d7650c961d58f04f90f0434d959f1e850e4b88 WatchSource:0}: Error finding container e0e009e3532691547228622714d7650c961d58f04f90f0434d959f1e850e4b88: Status 404 returned error can't find the container with id e0e009e3532691547228622714d7650c961d58f04f90f0434d959f1e850e4b88 Apr 17 16:58:47.923326 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:58:47.923283 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-58fcc7cb5-hr5x9" event={"ID":"c8b63800-1697-4644-bece-e40e2ff7d4e1","Type":"ContainerStarted","Data":"e0e009e3532691547228622714d7650c961d58f04f90f0434d959f1e850e4b88"} Apr 17 16:58:49.930054 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:58:49.930019 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-58fcc7cb5-hr5x9" event={"ID":"c8b63800-1697-4644-bece-e40e2ff7d4e1","Type":"ContainerStarted","Data":"ee566d67e34196bc42ab51028e83f93ab2e43f797b1e74bcd1b1e7e91b2ecb3e"} Apr 17 16:58:49.930054 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:58:49.930063 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-58fcc7cb5-hr5x9" Apr 17 16:58:49.966178 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:58:49.966133 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-58fcc7cb5-hr5x9" podStartSLOduration=1.731376341 podStartE2EDuration="3.96611733s" podCreationTimestamp="2026-04-17 16:58:46 +0000 UTC" firstStartedPulling="2026-04-17 16:58:47.073311518 +0000 UTC m=+371.980196317" lastFinishedPulling="2026-04-17 16:58:49.308052508 +0000 UTC m=+374.214937306" observedRunningTime="2026-04-17 16:58:49.964034685 +0000 UTC m=+374.870919502" watchObservedRunningTime="2026-04-17 16:58:49.96611733 +0000 UTC m=+374.873002145" Apr 17 16:58:54.272941 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:58:54.269462 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6b98d9f7df-x62t4"] Apr 17 16:58:54.275218 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:58:54.275184 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-6b98d9f7df-x62t4" Apr 17 16:58:54.279942 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:58:54.278954 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 17 16:58:54.279942 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:58:54.278982 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 17 16:58:54.279942 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:58:54.279215 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 17 16:58:54.279942 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:58:54.279307 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-28vvs\"" Apr 17 16:58:54.285186 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:58:54.284940 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 17 16:58:54.287347 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:58:54.287325 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6b98d9f7df-x62t4"] Apr 17 16:58:54.334452 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:58:54.334413 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f230a3a3-c8d5-4cda-acbb-010cbaf55e9a-webhook-cert\") pod \"opendatahub-operator-controller-manager-6b98d9f7df-x62t4\" (UID: \"f230a3a3-c8d5-4cda-acbb-010cbaf55e9a\") " pod="opendatahub/opendatahub-operator-controller-manager-6b98d9f7df-x62t4" Apr 17 16:58:54.334601 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:58:54.334551 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtdps\" (UniqueName: \"kubernetes.io/projected/f230a3a3-c8d5-4cda-acbb-010cbaf55e9a-kube-api-access-dtdps\") pod \"opendatahub-operator-controller-manager-6b98d9f7df-x62t4\" (UID: \"f230a3a3-c8d5-4cda-acbb-010cbaf55e9a\") " pod="opendatahub/opendatahub-operator-controller-manager-6b98d9f7df-x62t4" Apr 17 16:58:54.334652 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:58:54.334599 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f230a3a3-c8d5-4cda-acbb-010cbaf55e9a-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6b98d9f7df-x62t4\" (UID: \"f230a3a3-c8d5-4cda-acbb-010cbaf55e9a\") " pod="opendatahub/opendatahub-operator-controller-manager-6b98d9f7df-x62t4" Apr 17 16:58:54.435287 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:58:54.435253 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f230a3a3-c8d5-4cda-acbb-010cbaf55e9a-webhook-cert\") pod \"opendatahub-operator-controller-manager-6b98d9f7df-x62t4\" (UID: \"f230a3a3-c8d5-4cda-acbb-010cbaf55e9a\") " pod="opendatahub/opendatahub-operator-controller-manager-6b98d9f7df-x62t4" Apr 17 16:58:54.435456 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:58:54.435310 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dtdps\" (UniqueName: 
\"kubernetes.io/projected/f230a3a3-c8d5-4cda-acbb-010cbaf55e9a-kube-api-access-dtdps\") pod \"opendatahub-operator-controller-manager-6b98d9f7df-x62t4\" (UID: \"f230a3a3-c8d5-4cda-acbb-010cbaf55e9a\") " pod="opendatahub/opendatahub-operator-controller-manager-6b98d9f7df-x62t4" Apr 17 16:58:54.435456 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:58:54.435338 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f230a3a3-c8d5-4cda-acbb-010cbaf55e9a-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6b98d9f7df-x62t4\" (UID: \"f230a3a3-c8d5-4cda-acbb-010cbaf55e9a\") " pod="opendatahub/opendatahub-operator-controller-manager-6b98d9f7df-x62t4" Apr 17 16:58:54.437857 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:58:54.437833 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f230a3a3-c8d5-4cda-acbb-010cbaf55e9a-webhook-cert\") pod \"opendatahub-operator-controller-manager-6b98d9f7df-x62t4\" (UID: \"f230a3a3-c8d5-4cda-acbb-010cbaf55e9a\") " pod="opendatahub/opendatahub-operator-controller-manager-6b98d9f7df-x62t4" Apr 17 16:58:54.437986 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:58:54.437833 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f230a3a3-c8d5-4cda-acbb-010cbaf55e9a-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6b98d9f7df-x62t4\" (UID: \"f230a3a3-c8d5-4cda-acbb-010cbaf55e9a\") " pod="opendatahub/opendatahub-operator-controller-manager-6b98d9f7df-x62t4" Apr 17 16:58:54.446414 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:58:54.446389 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtdps\" (UniqueName: \"kubernetes.io/projected/f230a3a3-c8d5-4cda-acbb-010cbaf55e9a-kube-api-access-dtdps\") pod \"opendatahub-operator-controller-manager-6b98d9f7df-x62t4\" (UID: \"f230a3a3-c8d5-4cda-acbb-010cbaf55e9a\") " pod="opendatahub/opendatahub-operator-controller-manager-6b98d9f7df-x62t4" Apr 17 16:58:54.591706 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:58:54.591619 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-6b98d9f7df-x62t4" Apr 17 16:58:54.721973 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:58:54.721939 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6b98d9f7df-x62t4"] Apr 17 16:58:54.724722 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:58:54.724695 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf230a3a3_c8d5_4cda_acbb_010cbaf55e9a.slice/crio-4882e651a105601e6894f6d61546a253a78f4ed226d873ff3aae29dfa046b567 WatchSource:0}: Error finding container 4882e651a105601e6894f6d61546a253a78f4ed226d873ff3aae29dfa046b567: Status 404 returned error can't find the container with id 4882e651a105601e6894f6d61546a253a78f4ed226d873ff3aae29dfa046b567 Apr 17 16:58:54.945729 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:58:54.945648 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-6b98d9f7df-x62t4" event={"ID":"f230a3a3-c8d5-4cda-acbb-010cbaf55e9a","Type":"ContainerStarted","Data":"4882e651a105601e6894f6d61546a253a78f4ed226d873ff3aae29dfa046b567"} Apr 17 16:58:57.957903 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:58:57.957871 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-6b98d9f7df-x62t4" event={"ID":"f230a3a3-c8d5-4cda-acbb-010cbaf55e9a","Type":"ContainerStarted","Data":"87403afa827d561ba0ce646c463654fb261fe0115c068b9ad2ea00f6f2a72831"} Apr 17 16:58:57.958255 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:58:57.957946 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-6b98d9f7df-x62t4" Apr 17 16:58:57.978158 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:58:57.978121 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-6b98d9f7df-x62t4" podStartSLOduration=1.543479419 podStartE2EDuration="3.978109377s" podCreationTimestamp="2026-04-17 16:58:54 +0000 UTC" firstStartedPulling="2026-04-17 16:58:54.726285424 +0000 UTC m=+379.633170219" lastFinishedPulling="2026-04-17 16:58:57.160915383 +0000 UTC m=+382.067800177" observedRunningTime="2026-04-17 16:58:57.976624241 +0000 UTC m=+382.883509078" watchObservedRunningTime="2026-04-17 16:58:57.978109377 +0000 UTC m=+382.884994192" Apr 17 16:59:00.935145 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:59:00.935115 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-58fcc7cb5-hr5x9" Apr 17 16:59:08.962911 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:59:08.962882 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-6b98d9f7df-x62t4" Apr 17 16:59:11.639060 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:59:11.639031 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-764cf74f-d8lnj"] Apr 17 16:59:11.642861 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:59:11.642841 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-764cf74f-d8lnj" Apr 17 16:59:11.646983 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:59:11.646956 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 17 16:59:11.647112 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:59:11.647085 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 17 16:59:11.648160 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:59:11.648138 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" Apr 17 16:59:11.648271 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:59:11.648190 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-zsb4c\"" Apr 17 16:59:11.648271 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:59:11.648147 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 17 16:59:11.650435 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:59:11.650413 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-764cf74f-d8lnj"] Apr 17 16:59:11.783121 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:59:11.783094 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwssq\" (UniqueName: \"kubernetes.io/projected/87df2db0-639a-4670-8958-d99942902884-kube-api-access-fwssq\") pod \"kube-auth-proxy-764cf74f-d8lnj\" (UID: \"87df2db0-639a-4670-8958-d99942902884\") " pod="openshift-ingress/kube-auth-proxy-764cf74f-d8lnj" Apr 17 16:59:11.783265 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:59:11.783148 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/87df2db0-639a-4670-8958-d99942902884-tls-certs\") pod \"kube-auth-proxy-764cf74f-d8lnj\" (UID: \"87df2db0-639a-4670-8958-d99942902884\") " pod="openshift-ingress/kube-auth-proxy-764cf74f-d8lnj" Apr 17 16:59:11.783265 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:59:11.783250 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/87df2db0-639a-4670-8958-d99942902884-tmp\") pod \"kube-auth-proxy-764cf74f-d8lnj\" (UID: \"87df2db0-639a-4670-8958-d99942902884\") " pod="openshift-ingress/kube-auth-proxy-764cf74f-d8lnj" Apr 17 16:59:11.883804 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:59:11.883764 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/87df2db0-639a-4670-8958-d99942902884-tls-certs\") pod \"kube-auth-proxy-764cf74f-d8lnj\" (UID: \"87df2db0-639a-4670-8958-d99942902884\") " pod="openshift-ingress/kube-auth-proxy-764cf74f-d8lnj" Apr 17 16:59:11.883804 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:59:11.883817 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/87df2db0-639a-4670-8958-d99942902884-tmp\") pod \"kube-auth-proxy-764cf74f-d8lnj\" (UID: \"87df2db0-639a-4670-8958-d99942902884\") " pod="openshift-ingress/kube-auth-proxy-764cf74f-d8lnj" Apr 17 16:59:11.884034 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:59:11.883849 2568 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-fwssq\" (UniqueName: \"kubernetes.io/projected/87df2db0-639a-4670-8958-d99942902884-kube-api-access-fwssq\") pod \"kube-auth-proxy-764cf74f-d8lnj\" (UID: \"87df2db0-639a-4670-8958-d99942902884\") " pod="openshift-ingress/kube-auth-proxy-764cf74f-d8lnj" Apr 17 16:59:11.885992 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:59:11.885964 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/87df2db0-639a-4670-8958-d99942902884-tmp\") pod \"kube-auth-proxy-764cf74f-d8lnj\" (UID: \"87df2db0-639a-4670-8958-d99942902884\") " pod="openshift-ingress/kube-auth-proxy-764cf74f-d8lnj" Apr 17 16:59:11.886149 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:59:11.886134 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/87df2db0-639a-4670-8958-d99942902884-tls-certs\") pod \"kube-auth-proxy-764cf74f-d8lnj\" (UID: \"87df2db0-639a-4670-8958-d99942902884\") " pod="openshift-ingress/kube-auth-proxy-764cf74f-d8lnj" Apr 17 16:59:11.892112 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:59:11.892069 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwssq\" (UniqueName: \"kubernetes.io/projected/87df2db0-639a-4670-8958-d99942902884-kube-api-access-fwssq\") pod \"kube-auth-proxy-764cf74f-d8lnj\" (UID: \"87df2db0-639a-4670-8958-d99942902884\") " pod="openshift-ingress/kube-auth-proxy-764cf74f-d8lnj" Apr 17 16:59:11.953666 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:59:11.953642 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-764cf74f-d8lnj" Apr 17 16:59:12.074244 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:59:12.074100 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-764cf74f-d8lnj"] Apr 17 16:59:12.076802 ip-10-0-134-88 kubenswrapper[2568]: W0417 16:59:12.076766 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87df2db0_639a_4670_8958_d99942902884.slice/crio-af753441131cd6617803169634f53e5eef9702fe5537d8a94d09f922c492079f WatchSource:0}: Error finding container af753441131cd6617803169634f53e5eef9702fe5537d8a94d09f922c492079f: Status 404 returned error can't find the container with id af753441131cd6617803169634f53e5eef9702fe5537d8a94d09f922c492079f Apr 17 16:59:13.007155 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:59:13.007111 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-764cf74f-d8lnj" event={"ID":"87df2db0-639a-4670-8958-d99942902884","Type":"ContainerStarted","Data":"af753441131cd6617803169634f53e5eef9702fe5537d8a94d09f922c492079f"} Apr 17 16:59:16.018259 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:59:16.018228 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-764cf74f-d8lnj" event={"ID":"87df2db0-639a-4670-8958-d99942902884","Type":"ContainerStarted","Data":"0b18d9027b5a5b601f34fa7f61483122f23ef3cf85a43114f3451248fc856f7a"} Apr 17 16:59:16.037384 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:59:16.037329 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-764cf74f-d8lnj" podStartSLOduration=2.047185362 podStartE2EDuration="5.03731552s" podCreationTimestamp="2026-04-17 16:59:11 +0000 UTC" firstStartedPulling="2026-04-17 
Apr 17 16:59:16.037384 ip-10-0-134-88 kubenswrapper[2568]: I0417 16:59:16.037329 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-764cf74f-d8lnj" podStartSLOduration=2.047185362 podStartE2EDuration="5.03731552s" podCreationTimestamp="2026-04-17 16:59:11 +0000 UTC" firstStartedPulling="2026-04-17 16:59:12.078660186 +0000 UTC m=+396.985544980" lastFinishedPulling="2026-04-17 16:59:15.068790329 +0000 UTC m=+399.975675138" observedRunningTime="2026-04-17 16:59:16.036278057 +0000 UTC m=+400.943162874" watchObservedRunningTime="2026-04-17 16:59:16.03731552 +0000 UTC m=+400.944200335"
Apr 17 17:00:50.358842 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:00:50.358810 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-94rz7"]
Apr 17 17:00:50.362377 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:00:50.362355 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-94rz7"
Apr 17 17:00:50.365062 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:00:50.365042 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\""
Apr 17 17:00:50.365183 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:00:50.365097 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\""
Apr 17 17:00:50.365183 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:00:50.365139 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 17 17:00:50.365368 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:00:50.365353 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 17 17:00:50.366270 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:00:50.366252 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-rt5sd\""
Apr 17 17:00:50.369899 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:00:50.369677 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-94rz7"]
Apr 17 17:00:50.398749 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:00:50.398729 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/c7cf22e0-3e00-479d-8f65-e3d46f6f5bb4-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-94rz7\" (UID: \"c7cf22e0-3e00-479d-8f65-e3d46f6f5bb4\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-94rz7"
Apr 17 17:00:50.398749 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:00:50.398756 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/c7cf22e0-3e00-479d-8f65-e3d46f6f5bb4-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-94rz7\" (UID: \"c7cf22e0-3e00-479d-8f65-e3d46f6f5bb4\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-94rz7"
Apr 17 17:00:50.398899 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:00:50.398863 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkqdd\" (UniqueName: \"kubernetes.io/projected/c7cf22e0-3e00-479d-8f65-e3d46f6f5bb4-kube-api-access-gkqdd\") pod \"kuadrant-console-plugin-6cb54b5c86-94rz7\" (UID: \"c7cf22e0-3e00-479d-8f65-e3d46f6f5bb4\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-94rz7"
Apr 17 17:00:50.499482 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:00:50.499443 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gkqdd\" (UniqueName: \"kubernetes.io/projected/c7cf22e0-3e00-479d-8f65-e3d46f6f5bb4-kube-api-access-gkqdd\") pod \"kuadrant-console-plugin-6cb54b5c86-94rz7\" (UID: \"c7cf22e0-3e00-479d-8f65-e3d46f6f5bb4\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-94rz7"
Apr 17 17:00:50.499662 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:00:50.499608 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/c7cf22e0-3e00-479d-8f65-e3d46f6f5bb4-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-94rz7\" (UID: \"c7cf22e0-3e00-479d-8f65-e3d46f6f5bb4\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-94rz7"
Apr 17 17:00:50.499662 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:00:50.499632 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/c7cf22e0-3e00-479d-8f65-e3d46f6f5bb4-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-94rz7\" (UID: \"c7cf22e0-3e00-479d-8f65-e3d46f6f5bb4\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-94rz7"
Apr 17 17:00:50.500363 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:00:50.500343 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/c7cf22e0-3e00-479d-8f65-e3d46f6f5bb4-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-94rz7\" (UID: \"c7cf22e0-3e00-479d-8f65-e3d46f6f5bb4\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-94rz7"
Apr 17 17:00:50.502060 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:00:50.502040 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/c7cf22e0-3e00-479d-8f65-e3d46f6f5bb4-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-94rz7\" (UID: \"c7cf22e0-3e00-479d-8f65-e3d46f6f5bb4\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-94rz7"
Apr 17 17:00:50.507428 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:00:50.507402 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkqdd\" (UniqueName: \"kubernetes.io/projected/c7cf22e0-3e00-479d-8f65-e3d46f6f5bb4-kube-api-access-gkqdd\") pod \"kuadrant-console-plugin-6cb54b5c86-94rz7\" (UID: \"c7cf22e0-3e00-479d-8f65-e3d46f6f5bb4\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-94rz7"
Apr 17 17:00:50.672740 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:00:50.672661 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-94rz7"
Apr 17 17:00:50.789034 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:00:50.789002 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-94rz7"]
Apr 17 17:00:50.792045 ip-10-0-134-88 kubenswrapper[2568]: W0417 17:00:50.792016 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc7cf22e0_3e00_479d_8f65_e3d46f6f5bb4.slice/crio-3534d670006e8ff912bf7b57d092d1fcc74a4d6764396abda6f870e864467c37 WatchSource:0}: Error finding container 3534d670006e8ff912bf7b57d092d1fcc74a4d6764396abda6f870e864467c37: Status 404 returned error can't find the container with id 3534d670006e8ff912bf7b57d092d1fcc74a4d6764396abda6f870e864467c37
Apr 17 17:00:51.333364 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:00:51.333328 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-94rz7" event={"ID":"c7cf22e0-3e00-479d-8f65-e3d46f6f5bb4","Type":"ContainerStarted","Data":"3534d670006e8ff912bf7b57d092d1fcc74a4d6764396abda6f870e864467c37"}
Apr 17 17:01:00.575830 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:01:00.575793 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-tbctd"]
Apr 17 17:01:00.585729 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:01:00.585705 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-tbctd"
Apr 17 17:01:00.588603 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:01:00.588565 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-p76pq\""
Apr 17 17:01:00.590752 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:01:00.590728 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-tbctd"]
Apr 17 17:01:00.689772 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:01:00.689739 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjm98\" (UniqueName: \"kubernetes.io/projected/a41e7d1c-7975-44e9-b7cc-5c03386321e5-kube-api-access-rjm98\") pod \"kuadrant-operator-controller-manager-84b657d985-tbctd\" (UID: \"a41e7d1c-7975-44e9-b7cc-5c03386321e5\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-tbctd"
Apr 17 17:01:00.689977 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:01:00.689893 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/a41e7d1c-7975-44e9-b7cc-5c03386321e5-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-84b657d985-tbctd\" (UID: \"a41e7d1c-7975-44e9-b7cc-5c03386321e5\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-tbctd"
Apr 17 17:01:00.790500 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:01:00.790470 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rjm98\" (UniqueName: \"kubernetes.io/projected/a41e7d1c-7975-44e9-b7cc-5c03386321e5-kube-api-access-rjm98\") pod \"kuadrant-operator-controller-manager-84b657d985-tbctd\" (UID: \"a41e7d1c-7975-44e9-b7cc-5c03386321e5\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-tbctd"
Apr 17 17:01:00.790669 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:01:00.790602 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/a41e7d1c-7975-44e9-b7cc-5c03386321e5-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-84b657d985-tbctd\" (UID: \"a41e7d1c-7975-44e9-b7cc-5c03386321e5\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-tbctd"
Apr 17 17:01:00.790969 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:01:00.790939 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/a41e7d1c-7975-44e9-b7cc-5c03386321e5-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-84b657d985-tbctd\" (UID: \"a41e7d1c-7975-44e9-b7cc-5c03386321e5\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-tbctd"
Apr 17 17:01:00.800366 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:01:00.800339 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjm98\" (UniqueName: \"kubernetes.io/projected/a41e7d1c-7975-44e9-b7cc-5c03386321e5-kube-api-access-rjm98\") pod \"kuadrant-operator-controller-manager-84b657d985-tbctd\" (UID: \"a41e7d1c-7975-44e9-b7cc-5c03386321e5\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-tbctd"
Apr 17 17:01:00.866779 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:01:00.866671 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-tbctd"]
Apr 17 17:01:00.867819 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:01:00.867796 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-tbctd"
Apr 17 17:01:00.876022 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:01:00.875972 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-tbctd"]
Apr 17 17:01:14.423736 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:01:14.423695 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-94rz7" event={"ID":"c7cf22e0-3e00-479d-8f65-e3d46f6f5bb4","Type":"ContainerStarted","Data":"402b0a54eb2abbd0314b7dec2d8830737e66494f3f014063d7f3b008a2800b42"}
Apr 17 17:01:14.450510 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:01:14.450459 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-94rz7" podStartSLOduration=0.957264131 podStartE2EDuration="24.450445161s" podCreationTimestamp="2026-04-17 17:00:50 +0000 UTC" firstStartedPulling="2026-04-17 17:00:50.793628483 +0000 UTC m=+495.700513277" lastFinishedPulling="2026-04-17 17:01:14.286809514 +0000 UTC m=+519.193694307" observedRunningTime="2026-04-17 17:01:14.446642957 +0000 UTC m=+519.353527772" watchObservedRunningTime="2026-04-17 17:01:14.450445161 +0000 UTC m=+519.357329977"
Apr 17 17:01:16.745867 ip-10-0-134-88 kubenswrapper[2568]: W0417 17:01:16.745838 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda41e7d1c_7975_44e9_b7cc_5c03386321e5.slice/crio-6ed3e98f2f431259db546add5a8c1ed908469cab51a147b2011cb2bad57b9149 WatchSource:0}: Error finding container 6ed3e98f2f431259db546add5a8c1ed908469cab51a147b2011cb2bad57b9149: Status 404 returned error can't find the container with id 6ed3e98f2f431259db546add5a8c1ed908469cab51a147b2011cb2bad57b9149
Apr 17 17:01:22.456286 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:01:22.456256 2568 generic.go:358] "Generic (PLEG): container finished" podID="a41e7d1c-7975-44e9-b7cc-5c03386321e5" containerID="4adc70ed505787bdfb02463857de7fb6a82df0d8e5916c221ba5be298cb800f6" exitCode=1
Apr 17 17:01:22.461455 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:01:22.461429 2568 status_manager.go:895] "Failed to get status for pod" podUID="a41e7d1c-7975-44e9-b7cc-5c03386321e5" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-tbctd" err="pods \"kuadrant-operator-controller-manager-84b657d985-tbctd\" is forbidden: User \"system:node:ip-10-0-134-88.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-88.ec2.internal' and this object"
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-tbctd" Apr 17 17:01:22.513868 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:01:22.513842 2568 status_manager.go:895] "Failed to get status for pod" podUID="a41e7d1c-7975-44e9-b7cc-5c03386321e5" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-tbctd" err="pods \"kuadrant-operator-controller-manager-84b657d985-tbctd\" is forbidden: User \"system:node:ip-10-0-134-88.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-88.ec2.internal' and this object" Apr 17 17:01:22.589457 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:01:22.589384 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/a41e7d1c-7975-44e9-b7cc-5c03386321e5-extensions-socket-volume\") pod \"a41e7d1c-7975-44e9-b7cc-5c03386321e5\" (UID: \"a41e7d1c-7975-44e9-b7cc-5c03386321e5\") " Apr 17 17:01:22.589457 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:01:22.589453 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjm98\" (UniqueName: \"kubernetes.io/projected/a41e7d1c-7975-44e9-b7cc-5c03386321e5-kube-api-access-rjm98\") pod \"a41e7d1c-7975-44e9-b7cc-5c03386321e5\" (UID: \"a41e7d1c-7975-44e9-b7cc-5c03386321e5\") " Apr 17 17:01:22.589656 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:01:22.589633 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a41e7d1c-7975-44e9-b7cc-5c03386321e5-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "a41e7d1c-7975-44e9-b7cc-5c03386321e5" (UID: "a41e7d1c-7975-44e9-b7cc-5c03386321e5"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:01:22.589740 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:01:22.589725 2568 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/a41e7d1c-7975-44e9-b7cc-5c03386321e5-extensions-socket-volume\") on node \"ip-10-0-134-88.ec2.internal\" DevicePath \"\"" Apr 17 17:01:22.591523 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:01:22.591504 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a41e7d1c-7975-44e9-b7cc-5c03386321e5-kube-api-access-rjm98" (OuterVolumeSpecName: "kube-api-access-rjm98") pod "a41e7d1c-7975-44e9-b7cc-5c03386321e5" (UID: "a41e7d1c-7975-44e9-b7cc-5c03386321e5"). InnerVolumeSpecName "kube-api-access-rjm98". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:01:22.691129 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:01:22.691096 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rjm98\" (UniqueName: \"kubernetes.io/projected/a41e7d1c-7975-44e9-b7cc-5c03386321e5-kube-api-access-rjm98\") on node \"ip-10-0-134-88.ec2.internal\" DevicePath \"\"" Apr 17 17:01:23.460884 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:01:23.460854 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-tbctd" Apr 17 17:01:23.460884 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:01:23.460877 2568 scope.go:117] "RemoveContainer" containerID="4adc70ed505787bdfb02463857de7fb6a82df0d8e5916c221ba5be298cb800f6" Apr 17 17:01:23.463931 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:01:23.463884 2568 status_manager.go:895] "Failed to get status for pod" podUID="a41e7d1c-7975-44e9-b7cc-5c03386321e5" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-tbctd" err="pods \"kuadrant-operator-controller-manager-84b657d985-tbctd\" is forbidden: User \"system:node:ip-10-0-134-88.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-88.ec2.internal' and this object" Apr 17 17:01:23.471482 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:01:23.471458 2568 status_manager.go:895] "Failed to get status for pod" podUID="a41e7d1c-7975-44e9-b7cc-5c03386321e5" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-tbctd" err="pods \"kuadrant-operator-controller-manager-84b657d985-tbctd\" is forbidden: User \"system:node:ip-10-0-134-88.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-88.ec2.internal' and this object" Apr 17 17:01:23.639669 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:01:23.639636 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a41e7d1c-7975-44e9-b7cc-5c03386321e5" path="/var/lib/kubelet/pods/a41e7d1c-7975-44e9-b7cc-5c03386321e5/volumes" Apr 17 17:01:45.736905 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:01:45.736872 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5qfjh"] Apr 17 17:01:45.737382 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:01:45.737275 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a41e7d1c-7975-44e9-b7cc-5c03386321e5" containerName="manager" Apr 17 17:01:45.737382 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:01:45.737288 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="a41e7d1c-7975-44e9-b7cc-5c03386321e5" containerName="manager" Apr 17 17:01:45.737382 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:01:45.737342 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="a41e7d1c-7975-44e9-b7cc-5c03386321e5" containerName="manager" Apr 17 17:01:45.776403 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:01:45.776375 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5qfjh"] Apr 17 17:01:45.776558 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:01:45.776488 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-5qfjh" Apr 17 17:01:45.780553 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:01:45.780532 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 17 17:01:45.785532 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:01:45.785508 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62xlf\" (UniqueName: \"kubernetes.io/projected/85c2248e-103f-4555-968a-221b4672ab94-kube-api-access-62xlf\") pod \"limitador-limitador-78c99df468-5qfjh\" (UID: \"85c2248e-103f-4555-968a-221b4672ab94\") " pod="kuadrant-system/limitador-limitador-78c99df468-5qfjh" Apr 17 17:01:45.785645 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:01:45.785574 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/85c2248e-103f-4555-968a-221b4672ab94-config-file\") pod \"limitador-limitador-78c99df468-5qfjh\" (UID: \"85c2248e-103f-4555-968a-221b4672ab94\") " pod="kuadrant-system/limitador-limitador-78c99df468-5qfjh" Apr 17 17:01:45.787303 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:01:45.787287 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5qfjh"] Apr 17 17:01:45.886306 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:01:45.886270 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-62xlf\" (UniqueName: \"kubernetes.io/projected/85c2248e-103f-4555-968a-221b4672ab94-kube-api-access-62xlf\") pod \"limitador-limitador-78c99df468-5qfjh\" (UID: \"85c2248e-103f-4555-968a-221b4672ab94\") " pod="kuadrant-system/limitador-limitador-78c99df468-5qfjh" Apr 17 17:01:45.886476 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:01:45.886325 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/85c2248e-103f-4555-968a-221b4672ab94-config-file\") pod \"limitador-limitador-78c99df468-5qfjh\" (UID: \"85c2248e-103f-4555-968a-221b4672ab94\") " pod="kuadrant-system/limitador-limitador-78c99df468-5qfjh" Apr 17 17:01:45.886913 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:01:45.886894 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/85c2248e-103f-4555-968a-221b4672ab94-config-file\") pod \"limitador-limitador-78c99df468-5qfjh\" (UID: \"85c2248e-103f-4555-968a-221b4672ab94\") " pod="kuadrant-system/limitador-limitador-78c99df468-5qfjh" Apr 17 17:01:45.900104 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:01:45.900074 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-62xlf\" (UniqueName: \"kubernetes.io/projected/85c2248e-103f-4555-968a-221b4672ab94-kube-api-access-62xlf\") pod \"limitador-limitador-78c99df468-5qfjh\" (UID: \"85c2248e-103f-4555-968a-221b4672ab94\") " pod="kuadrant-system/limitador-limitador-78c99df468-5qfjh" Apr 17 17:01:46.087837 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:01:46.087739 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-5qfjh" Apr 17 17:01:46.219271 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:01:46.219243 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5qfjh"] Apr 17 17:01:46.219736 ip-10-0-134-88 kubenswrapper[2568]: W0417 17:01:46.219711 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85c2248e_103f_4555_968a_221b4672ab94.slice/crio-38f4cd367f7c955f0797c57355c0c5735931139283684951b6fab3c67ade528c WatchSource:0}: Error finding container 38f4cd367f7c955f0797c57355c0c5735931139283684951b6fab3c67ade528c: Status 404 returned error can't find the container with id 38f4cd367f7c955f0797c57355c0c5735931139283684951b6fab3c67ade528c Apr 17 17:01:46.402688 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:01:46.402620 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7498df8756-xxxz4"] Apr 17 17:01:46.408506 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:01:46.408488 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-xxxz4" Apr 17 17:01:46.411135 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:01:46.411103 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-wrrmf\"" Apr 17 17:01:46.412021 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:01:46.411999 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-xxxz4"] Apr 17 17:01:46.490910 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:01:46.490880 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rh64j\" (UniqueName: \"kubernetes.io/projected/a041a825-9d5a-42d9-9146-3da4095f0da5-kube-api-access-rh64j\") pod \"authorino-7498df8756-xxxz4\" (UID: \"a041a825-9d5a-42d9-9146-3da4095f0da5\") " pod="kuadrant-system/authorino-7498df8756-xxxz4" Apr 17 17:01:46.539387 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:01:46.539356 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-5qfjh" event={"ID":"85c2248e-103f-4555-968a-221b4672ab94","Type":"ContainerStarted","Data":"38f4cd367f7c955f0797c57355c0c5735931139283684951b6fab3c67ade528c"} Apr 17 17:01:46.591965 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:01:46.591909 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rh64j\" (UniqueName: \"kubernetes.io/projected/a041a825-9d5a-42d9-9146-3da4095f0da5-kube-api-access-rh64j\") pod \"authorino-7498df8756-xxxz4\" (UID: \"a041a825-9d5a-42d9-9146-3da4095f0da5\") " pod="kuadrant-system/authorino-7498df8756-xxxz4" Apr 17 17:01:46.600885 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:01:46.600858 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rh64j\" (UniqueName: \"kubernetes.io/projected/a041a825-9d5a-42d9-9146-3da4095f0da5-kube-api-access-rh64j\") pod \"authorino-7498df8756-xxxz4\" (UID: \"a041a825-9d5a-42d9-9146-3da4095f0da5\") " pod="kuadrant-system/authorino-7498df8756-xxxz4" Apr 17 17:01:46.718441 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:01:46.718357 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-xxxz4" Apr 17 17:01:46.887935 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:01:46.887857 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-xxxz4"] Apr 17 17:01:46.895820 ip-10-0-134-88 kubenswrapper[2568]: W0417 17:01:46.895787 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda041a825_9d5a_42d9_9146_3da4095f0da5.slice/crio-b1a82b8a3c916574886d75c2bff818aca26d2974ca2f18424757c6c1be116e39 WatchSource:0}: Error finding container b1a82b8a3c916574886d75c2bff818aca26d2974ca2f18424757c6c1be116e39: Status 404 returned error can't find the container with id b1a82b8a3c916574886d75c2bff818aca26d2974ca2f18424757c6c1be116e39 Apr 17 17:01:47.546405 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:01:47.546364 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-xxxz4" event={"ID":"a041a825-9d5a-42d9-9146-3da4095f0da5","Type":"ContainerStarted","Data":"b1a82b8a3c916574886d75c2bff818aca26d2974ca2f18424757c6c1be116e39"} Apr 17 17:01:50.558776 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:01:50.558737 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-xxxz4" event={"ID":"a041a825-9d5a-42d9-9146-3da4095f0da5","Type":"ContainerStarted","Data":"d3cbf2df614ce838729ddc68ecf913d07c5721a2a65f97b6dda183be99f08b0a"} Apr 17 17:01:50.560082 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:01:50.560055 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-5qfjh" event={"ID":"85c2248e-103f-4555-968a-221b4672ab94","Type":"ContainerStarted","Data":"08f3a051cdfd361c5975e98ef016ff5696ccd4ce539cb6ff39dc12b7a8432119"} Apr 17 17:01:50.560232 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:01:50.560216 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-78c99df468-5qfjh" Apr 17 17:01:50.573780 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:01:50.573738 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7498df8756-xxxz4" podStartSLOduration=1.61137067 podStartE2EDuration="4.573727002s" podCreationTimestamp="2026-04-17 17:01:46 +0000 UTC" firstStartedPulling="2026-04-17 17:01:46.899697574 +0000 UTC m=+551.806582369" lastFinishedPulling="2026-04-17 17:01:49.8620539 +0000 UTC m=+554.768938701" observedRunningTime="2026-04-17 17:01:50.572692139 +0000 UTC m=+555.479576957" watchObservedRunningTime="2026-04-17 17:01:50.573727002 +0000 UTC m=+555.480611817" Apr 17 17:01:50.589034 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:01:50.588986 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-78c99df468-5qfjh" podStartSLOduration=1.888195024 podStartE2EDuration="5.588970788s" podCreationTimestamp="2026-04-17 17:01:45 +0000 UTC" firstStartedPulling="2026-04-17 17:01:46.221561759 +0000 UTC m=+551.128446553" lastFinishedPulling="2026-04-17 17:01:49.922337513 +0000 UTC m=+554.829222317" observedRunningTime="2026-04-17 17:01:50.587766782 +0000 UTC m=+555.494651610" watchObservedRunningTime="2026-04-17 17:01:50.588970788 +0000 UTC m=+555.495855606" Apr 17 17:02:01.564962 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:02:01.564910 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kuadrant-system/limitador-limitador-78c99df468-5qfjh" Apr 17 17:02:20.405803 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:02:20.405769 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7b98c979d4-sh6vq"] Apr 17 17:02:20.462321 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:02:20.462291 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7b98c979d4-sh6vq"] Apr 17 17:02:20.462321 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:02:20.462319 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5qfjh"] Apr 17 17:02:20.462505 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:02:20.462416 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7b98c979d4-sh6vq" Apr 17 17:02:20.465272 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:02:20.465252 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\"" Apr 17 17:02:20.586356 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:02:20.586316 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kd7st\" (UniqueName: \"kubernetes.io/projected/10cc3c8b-abc7-454f-b8ad-e5c23b67d4d5-kube-api-access-kd7st\") pod \"authorino-7b98c979d4-sh6vq\" (UID: \"10cc3c8b-abc7-454f-b8ad-e5c23b67d4d5\") " pod="kuadrant-system/authorino-7b98c979d4-sh6vq" Apr 17 17:02:20.586535 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:02:20.586462 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/10cc3c8b-abc7-454f-b8ad-e5c23b67d4d5-tls-cert\") pod \"authorino-7b98c979d4-sh6vq\" (UID: \"10cc3c8b-abc7-454f-b8ad-e5c23b67d4d5\") " pod="kuadrant-system/authorino-7b98c979d4-sh6vq" Apr 17 17:02:20.687493 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:02:20.687408 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kd7st\" (UniqueName: \"kubernetes.io/projected/10cc3c8b-abc7-454f-b8ad-e5c23b67d4d5-kube-api-access-kd7st\") pod \"authorino-7b98c979d4-sh6vq\" (UID: \"10cc3c8b-abc7-454f-b8ad-e5c23b67d4d5\") " pod="kuadrant-system/authorino-7b98c979d4-sh6vq" Apr 17 17:02:20.687493 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:02:20.687468 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/10cc3c8b-abc7-454f-b8ad-e5c23b67d4d5-tls-cert\") pod \"authorino-7b98c979d4-sh6vq\" (UID: \"10cc3c8b-abc7-454f-b8ad-e5c23b67d4d5\") " pod="kuadrant-system/authorino-7b98c979d4-sh6vq" Apr 17 17:02:20.689836 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:02:20.689805 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/10cc3c8b-abc7-454f-b8ad-e5c23b67d4d5-tls-cert\") pod \"authorino-7b98c979d4-sh6vq\" (UID: \"10cc3c8b-abc7-454f-b8ad-e5c23b67d4d5\") " pod="kuadrant-system/authorino-7b98c979d4-sh6vq" Apr 17 17:02:20.695849 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:02:20.695824 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kd7st\" (UniqueName: \"kubernetes.io/projected/10cc3c8b-abc7-454f-b8ad-e5c23b67d4d5-kube-api-access-kd7st\") pod \"authorino-7b98c979d4-sh6vq\" (UID: \"10cc3c8b-abc7-454f-b8ad-e5c23b67d4d5\") " pod="kuadrant-system/authorino-7b98c979d4-sh6vq" Apr 17 17:02:20.771570 ip-10-0-134-88 
kubenswrapper[2568]: I0417 17:02:20.771537 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7b98c979d4-sh6vq" Apr 17 17:02:20.890635 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:02:20.890605 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7b98c979d4-sh6vq"] Apr 17 17:02:20.894135 ip-10-0-134-88 kubenswrapper[2568]: W0417 17:02:20.894109 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod10cc3c8b_abc7_454f_b8ad_e5c23b67d4d5.slice/crio-869e634dfec8e6ceabcf86b8b0f832fe8fabd02f3f69876217e679128219fae8 WatchSource:0}: Error finding container 869e634dfec8e6ceabcf86b8b0f832fe8fabd02f3f69876217e679128219fae8: Status 404 returned error can't find the container with id 869e634dfec8e6ceabcf86b8b0f832fe8fabd02f3f69876217e679128219fae8 Apr 17 17:02:21.668472 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:02:21.668429 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7b98c979d4-sh6vq" event={"ID":"10cc3c8b-abc7-454f-b8ad-e5c23b67d4d5","Type":"ContainerStarted","Data":"16fed4857a274d6fe82c68b9ff5716225415e7b7146f4bcea671009f32c33b33"} Apr 17 17:02:21.668472 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:02:21.668468 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7b98c979d4-sh6vq" event={"ID":"10cc3c8b-abc7-454f-b8ad-e5c23b67d4d5","Type":"ContainerStarted","Data":"869e634dfec8e6ceabcf86b8b0f832fe8fabd02f3f69876217e679128219fae8"} Apr 17 17:02:21.686337 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:02:21.686283 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7b98c979d4-sh6vq" podStartSLOduration=1.378993669 podStartE2EDuration="1.686269666s" podCreationTimestamp="2026-04-17 17:02:20 +0000 UTC" firstStartedPulling="2026-04-17 17:02:20.895846256 +0000 UTC m=+585.802731049" lastFinishedPulling="2026-04-17 17:02:21.203122249 +0000 UTC m=+586.110007046" observedRunningTime="2026-04-17 17:02:21.684709643 +0000 UTC m=+586.591594458" watchObservedRunningTime="2026-04-17 17:02:21.686269666 +0000 UTC m=+586.593154481" Apr 17 17:02:21.707183 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:02:21.707153 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-xxxz4"] Apr 17 17:02:21.707412 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:02:21.707382 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-7498df8756-xxxz4" podUID="a041a825-9d5a-42d9-9146-3da4095f0da5" containerName="authorino" containerID="cri-o://d3cbf2df614ce838729ddc68ecf913d07c5721a2a65f97b6dda183be99f08b0a" gracePeriod=30 Apr 17 17:02:21.940573 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:02:21.940552 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-xxxz4" Apr 17 17:02:22.099573 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:02:22.099541 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rh64j\" (UniqueName: \"kubernetes.io/projected/a041a825-9d5a-42d9-9146-3da4095f0da5-kube-api-access-rh64j\") pod \"a041a825-9d5a-42d9-9146-3da4095f0da5\" (UID: \"a041a825-9d5a-42d9-9146-3da4095f0da5\") " Apr 17 17:02:22.101498 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:02:22.101467 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a041a825-9d5a-42d9-9146-3da4095f0da5-kube-api-access-rh64j" (OuterVolumeSpecName: "kube-api-access-rh64j") pod "a041a825-9d5a-42d9-9146-3da4095f0da5" (UID: "a041a825-9d5a-42d9-9146-3da4095f0da5"). InnerVolumeSpecName "kube-api-access-rh64j". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:02:22.200542 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:02:22.200485 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rh64j\" (UniqueName: \"kubernetes.io/projected/a041a825-9d5a-42d9-9146-3da4095f0da5-kube-api-access-rh64j\") on node \"ip-10-0-134-88.ec2.internal\" DevicePath \"\"" Apr 17 17:02:22.674287 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:02:22.674249 2568 generic.go:358] "Generic (PLEG): container finished" podID="a041a825-9d5a-42d9-9146-3da4095f0da5" containerID="d3cbf2df614ce838729ddc68ecf913d07c5721a2a65f97b6dda183be99f08b0a" exitCode=0 Apr 17 17:02:22.674697 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:02:22.674307 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-xxxz4" Apr 17 17:02:22.674697 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:02:22.674327 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-xxxz4" event={"ID":"a041a825-9d5a-42d9-9146-3da4095f0da5","Type":"ContainerDied","Data":"d3cbf2df614ce838729ddc68ecf913d07c5721a2a65f97b6dda183be99f08b0a"} Apr 17 17:02:22.674697 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:02:22.674377 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-xxxz4" event={"ID":"a041a825-9d5a-42d9-9146-3da4095f0da5","Type":"ContainerDied","Data":"b1a82b8a3c916574886d75c2bff818aca26d2974ca2f18424757c6c1be116e39"} Apr 17 17:02:22.674697 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:02:22.674404 2568 scope.go:117] "RemoveContainer" containerID="d3cbf2df614ce838729ddc68ecf913d07c5721a2a65f97b6dda183be99f08b0a" Apr 17 17:02:22.682750 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:02:22.682731 2568 scope.go:117] "RemoveContainer" containerID="d3cbf2df614ce838729ddc68ecf913d07c5721a2a65f97b6dda183be99f08b0a" Apr 17 17:02:22.683004 ip-10-0-134-88 kubenswrapper[2568]: E0417 17:02:22.682984 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3cbf2df614ce838729ddc68ecf913d07c5721a2a65f97b6dda183be99f08b0a\": container with ID starting with d3cbf2df614ce838729ddc68ecf913d07c5721a2a65f97b6dda183be99f08b0a not found: ID does not exist" containerID="d3cbf2df614ce838729ddc68ecf913d07c5721a2a65f97b6dda183be99f08b0a" Apr 17 17:02:22.683083 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:02:22.683011 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3cbf2df614ce838729ddc68ecf913d07c5721a2a65f97b6dda183be99f08b0a"} 
err="failed to get container status \"d3cbf2df614ce838729ddc68ecf913d07c5721a2a65f97b6dda183be99f08b0a\": rpc error: code = NotFound desc = could not find container \"d3cbf2df614ce838729ddc68ecf913d07c5721a2a65f97b6dda183be99f08b0a\": container with ID starting with d3cbf2df614ce838729ddc68ecf913d07c5721a2a65f97b6dda183be99f08b0a not found: ID does not exist" Apr 17 17:02:22.695298 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:02:22.695263 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-xxxz4"] Apr 17 17:02:22.696770 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:02:22.696750 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-7498df8756-xxxz4"] Apr 17 17:02:23.638939 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:02:23.638892 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a041a825-9d5a-42d9-9146-3da4095f0da5" path="/var/lib/kubelet/pods/a041a825-9d5a-42d9-9146-3da4095f0da5/volumes" Apr 17 17:02:35.545913 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:02:35.545886 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-c7rtc_30e3b557-37e3-4fa9-9974-a7e12eff41fb/cluster-monitoring-operator/0.log" Apr 17 17:02:35.546998 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:02:35.546975 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-c7rtc_30e3b557-37e3-4fa9-9974-a7e12eff41fb/cluster-monitoring-operator/0.log" Apr 17 17:02:35.555417 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:02:35.555393 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q5p9p_f461b687-3271-484f-a873-6a5fb0b1214d/ovn-acl-logging/0.log" Apr 17 17:02:35.556031 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:02:35.556010 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q5p9p_f461b687-3271-484f-a873-6a5fb0b1214d/ovn-acl-logging/0.log" Apr 17 17:02:53.331852 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:02:53.331772 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5qfjh"] Apr 17 17:03:07.532858 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:03:07.532812 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5qfjh"] Apr 17 17:03:12.927351 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:03:12.927308 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5qfjh"] Apr 17 17:03:27.815850 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:03:27.815814 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5qfjh"] Apr 17 17:03:41.217346 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:03:41.217308 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5qfjh"] Apr 17 17:03:47.416005 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:03:47.415971 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5qfjh"] Apr 17 17:07:35.573116 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:07:35.573033 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-c7rtc_30e3b557-37e3-4fa9-9974-a7e12eff41fb/cluster-monitoring-operator/0.log" Apr 17 17:07:35.574853 
ip-10-0-134-88 kubenswrapper[2568]: I0417 17:07:35.574830 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-c7rtc_30e3b557-37e3-4fa9-9974-a7e12eff41fb/cluster-monitoring-operator/0.log" Apr 17 17:07:35.580155 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:07:35.580132 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q5p9p_f461b687-3271-484f-a873-6a5fb0b1214d/ovn-acl-logging/0.log" Apr 17 17:07:35.584595 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:07:35.584565 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q5p9p_f461b687-3271-484f-a873-6a5fb0b1214d/ovn-acl-logging/0.log" Apr 17 17:07:51.550316 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:07:51.550288 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-7b98c979d4-sh6vq_10cc3c8b-abc7-454f-b8ad-e5c23b67d4d5/authorino/0.log" Apr 17 17:07:56.256149 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:07:56.256117 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-6b98d9f7df-x62t4_f230a3a3-c8d5-4cda-acbb-010cbaf55e9a/manager/0.log" Apr 17 17:07:57.666508 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:07:57.666476 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-7b98c979d4-sh6vq_10cc3c8b-abc7-454f-b8ad-e5c23b67d4d5/authorino/0.log" Apr 17 17:07:58.021167 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:07:58.021136 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-94rz7_c7cf22e0-3e00-479d-8f65-e3d46f6f5bb4/kuadrant-console-plugin/0.log" Apr 17 17:07:58.367763 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:07:58.367672 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-5qfjh_85c2248e-103f-4555-968a-221b4672ab94/limitador/0.log" Apr 17 17:07:59.181184 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:07:59.181153 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-764cf74f-d8lnj_87df2db0-639a-4670-8958-d99942902884/kube-auth-proxy/0.log" Apr 17 17:08:04.427455 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:04.427419 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-gbgn6/must-gather-696c8"] Apr 17 17:08:04.428013 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:04.427966 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a041a825-9d5a-42d9-9146-3da4095f0da5" containerName="authorino" Apr 17 17:08:04.428013 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:04.427988 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="a041a825-9d5a-42d9-9146-3da4095f0da5" containerName="authorino" Apr 17 17:08:04.428136 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:04.428090 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="a041a825-9d5a-42d9-9146-3da4095f0da5" containerName="authorino" Apr 17 17:08:04.430790 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:04.430768 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-gbgn6/must-gather-696c8" Apr 17 17:08:04.433326 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:04.433300 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-gbgn6\"/\"kube-root-ca.crt\"" Apr 17 17:08:04.434432 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:04.434411 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-gbgn6\"/\"openshift-service-ca.crt\"" Apr 17 17:08:04.434529 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:04.434417 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-gbgn6\"/\"default-dockercfg-s7d7q\"" Apr 17 17:08:04.439376 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:04.439353 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gbgn6/must-gather-696c8"] Apr 17 17:08:04.553649 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:04.553622 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nz5n\" (UniqueName: \"kubernetes.io/projected/44f8b48d-2c5b-4500-bc91-4f42afc40aae-kube-api-access-4nz5n\") pod \"must-gather-696c8\" (UID: \"44f8b48d-2c5b-4500-bc91-4f42afc40aae\") " pod="openshift-must-gather-gbgn6/must-gather-696c8" Apr 17 17:08:04.553790 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:04.553657 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/44f8b48d-2c5b-4500-bc91-4f42afc40aae-must-gather-output\") pod \"must-gather-696c8\" (UID: \"44f8b48d-2c5b-4500-bc91-4f42afc40aae\") " pod="openshift-must-gather-gbgn6/must-gather-696c8" Apr 17 17:08:04.654875 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:04.654836 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4nz5n\" (UniqueName: \"kubernetes.io/projected/44f8b48d-2c5b-4500-bc91-4f42afc40aae-kube-api-access-4nz5n\") pod \"must-gather-696c8\" (UID: \"44f8b48d-2c5b-4500-bc91-4f42afc40aae\") " pod="openshift-must-gather-gbgn6/must-gather-696c8" Apr 17 17:08:04.654875 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:04.654880 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/44f8b48d-2c5b-4500-bc91-4f42afc40aae-must-gather-output\") pod \"must-gather-696c8\" (UID: \"44f8b48d-2c5b-4500-bc91-4f42afc40aae\") " pod="openshift-must-gather-gbgn6/must-gather-696c8" Apr 17 17:08:04.655197 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:04.655182 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/44f8b48d-2c5b-4500-bc91-4f42afc40aae-must-gather-output\") pod \"must-gather-696c8\" (UID: \"44f8b48d-2c5b-4500-bc91-4f42afc40aae\") " pod="openshift-must-gather-gbgn6/must-gather-696c8" Apr 17 17:08:04.663570 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:04.663539 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nz5n\" (UniqueName: \"kubernetes.io/projected/44f8b48d-2c5b-4500-bc91-4f42afc40aae-kube-api-access-4nz5n\") pod \"must-gather-696c8\" (UID: \"44f8b48d-2c5b-4500-bc91-4f42afc40aae\") " pod="openshift-must-gather-gbgn6/must-gather-696c8" Apr 17 17:08:04.740706 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:04.740685 2568 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-must-gather-gbgn6/must-gather-696c8" Apr 17 17:08:04.859869 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:04.859825 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gbgn6/must-gather-696c8"] Apr 17 17:08:04.862118 ip-10-0-134-88 kubenswrapper[2568]: W0417 17:08:04.862096 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44f8b48d_2c5b_4500_bc91_4f42afc40aae.slice/crio-25684cd28bdd1b37bf4dedf708c7ba7b63bedcc5a6b8b2b901ced4ff84046bd2 WatchSource:0}: Error finding container 25684cd28bdd1b37bf4dedf708c7ba7b63bedcc5a6b8b2b901ced4ff84046bd2: Status 404 returned error can't find the container with id 25684cd28bdd1b37bf4dedf708c7ba7b63bedcc5a6b8b2b901ced4ff84046bd2 Apr 17 17:08:04.863853 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:04.863831 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 17:08:05.830555 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:05.830530 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gbgn6/must-gather-696c8" event={"ID":"44f8b48d-2c5b-4500-bc91-4f42afc40aae","Type":"ContainerStarted","Data":"25684cd28bdd1b37bf4dedf708c7ba7b63bedcc5a6b8b2b901ced4ff84046bd2"} Apr 17 17:08:06.837905 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:06.837862 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gbgn6/must-gather-696c8" event={"ID":"44f8b48d-2c5b-4500-bc91-4f42afc40aae","Type":"ContainerStarted","Data":"e3204a7037ac8ae8a8d8cb7572c83bd26025293d89e4b9837c969ae5cd9fc8ed"} Apr 17 17:08:06.838459 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:06.838430 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gbgn6/must-gather-696c8" event={"ID":"44f8b48d-2c5b-4500-bc91-4f42afc40aae","Type":"ContainerStarted","Data":"001b41399901a69c38181fe176210ea7aff3b7a1380b01ebdefdf027d260ba65"} Apr 17 17:08:06.856376 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:06.856312 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-gbgn6/must-gather-696c8" podStartSLOduration=1.9972011379999999 podStartE2EDuration="2.856293946s" podCreationTimestamp="2026-04-17 17:08:04 +0000 UTC" firstStartedPulling="2026-04-17 17:08:04.863977784 +0000 UTC m=+929.770862579" lastFinishedPulling="2026-04-17 17:08:05.723070589 +0000 UTC m=+930.629955387" observedRunningTime="2026-04-17 17:08:06.854409727 +0000 UTC m=+931.761294548" watchObservedRunningTime="2026-04-17 17:08:06.856293946 +0000 UTC m=+931.763178762" Apr 17 17:08:07.309375 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:07.309333 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-jgf62_e15ee7f9-dfd3-4121-89e2-a4eefd35413e/global-pull-secret-syncer/0.log" Apr 17 17:08:07.417881 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:07.417848 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-h5kgg_747af7cf-1df7-4cfd-8bb2-841945c9fd3e/konnectivity-agent/0.log" Apr 17 17:08:07.472103 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:07.472067 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-134-88.ec2.internal_b1ba083eb7d02ba302027e7be390bc5c/haproxy/0.log" Apr 17 17:08:11.303767 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:11.303734 2568 log.go:25] "Finished 
parsing log file" path="/var/log/pods/kuadrant-system_authorino-7b98c979d4-sh6vq_10cc3c8b-abc7-454f-b8ad-e5c23b67d4d5/authorino/0.log" Apr 17 17:08:11.373278 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:11.373244 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-94rz7_c7cf22e0-3e00-479d-8f65-e3d46f6f5bb4/kuadrant-console-plugin/0.log" Apr 17 17:08:11.478503 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:11.478464 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-5qfjh_85c2248e-103f-4555-968a-221b4672ab94/limitador/0.log" Apr 17 17:08:12.914987 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:12.914593 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-c7rtc_30e3b557-37e3-4fa9-9974-a7e12eff41fb/cluster-monitoring-operator/1.log" Apr 17 17:08:13.094941 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:13.094467 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-c7rtc_30e3b557-37e3-4fa9-9974-a7e12eff41fb/cluster-monitoring-operator/0.log" Apr 17 17:08:13.123808 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:13.123689 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-64ttp_69985995-b454-4287-8e16-ad76e4c4e3e3/kube-state-metrics/0.log" Apr 17 17:08:13.146678 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:13.146644 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-64ttp_69985995-b454-4287-8e16-ad76e4c4e3e3/kube-rbac-proxy-main/0.log" Apr 17 17:08:13.169696 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:13.169619 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-64ttp_69985995-b454-4287-8e16-ad76e4c4e3e3/kube-rbac-proxy-self/0.log" Apr 17 17:08:13.223323 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:13.223278 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-fqfhp_0062c6f7-d03d-47c2-b629-6c7c3639acb9/monitoring-plugin/0.log" Apr 17 17:08:13.384720 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:13.384695 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zpgj6_8cbfc4c3-65fb-4b43-a0f8-4b26ca71366c/node-exporter/0.log" Apr 17 17:08:13.406643 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:13.406612 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zpgj6_8cbfc4c3-65fb-4b43-a0f8-4b26ca71366c/kube-rbac-proxy/0.log" Apr 17 17:08:13.427256 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:13.427099 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zpgj6_8cbfc4c3-65fb-4b43-a0f8-4b26ca71366c/init-textfile/0.log" Apr 17 17:08:13.458143 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:13.458115 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-55zhx_e38f13c7-0aae-46bc-9fad-7df2e1de3aa2/kube-rbac-proxy-main/0.log" Apr 17 17:08:13.476102 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:13.476018 2568 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-55zhx_e38f13c7-0aae-46bc-9fad-7df2e1de3aa2/kube-rbac-proxy-self/0.log" Apr 17 17:08:13.497734 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:13.497696 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-55zhx_e38f13c7-0aae-46bc-9fad-7df2e1de3aa2/openshift-state-metrics/0.log" Apr 17 17:08:13.535515 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:13.535476 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_728f29e0-5be4-46bb-ac36-eaa4660503ad/prometheus/0.log" Apr 17 17:08:13.552290 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:13.552262 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_728f29e0-5be4-46bb-ac36-eaa4660503ad/config-reloader/0.log" Apr 17 17:08:13.569410 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:13.569378 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_728f29e0-5be4-46bb-ac36-eaa4660503ad/thanos-sidecar/0.log" Apr 17 17:08:13.593890 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:13.593853 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_728f29e0-5be4-46bb-ac36-eaa4660503ad/kube-rbac-proxy-web/0.log" Apr 17 17:08:13.611542 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:13.611511 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_728f29e0-5be4-46bb-ac36-eaa4660503ad/kube-rbac-proxy/0.log" Apr 17 17:08:13.628413 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:13.628376 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_728f29e0-5be4-46bb-ac36-eaa4660503ad/kube-rbac-proxy-thanos/0.log" Apr 17 17:08:13.658619 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:13.658574 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_728f29e0-5be4-46bb-ac36-eaa4660503ad/init-config-reloader/0.log" Apr 17 17:08:13.685592 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:13.685509 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-f8m8f_14a9d6ea-9985-48e3-969f-8f976c314970/prometheus-operator/0.log" Apr 17 17:08:13.702128 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:13.702100 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-f8m8f_14a9d6ea-9985-48e3-969f-8f976c314970/kube-rbac-proxy/0.log" Apr 17 17:08:13.729786 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:13.729757 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-q59r6_2ccace76-e6f7-4577-8d52-ce5ff0fb350e/prometheus-operator-admission-webhook/0.log" Apr 17 17:08:13.832069 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:13.832037 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-757c9669d9-7849l_84be4b0e-4b66-4061-a3d5-3a8708d4255b/thanos-query/0.log" Apr 17 17:08:13.851018 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:13.850975 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-757c9669d9-7849l_84be4b0e-4b66-4061-a3d5-3a8708d4255b/kube-rbac-proxy-web/0.log" Apr 17 17:08:13.872946 ip-10-0-134-88 kubenswrapper[2568]: I0417 
17:08:13.872897 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-757c9669d9-7849l_84be4b0e-4b66-4061-a3d5-3a8708d4255b/kube-rbac-proxy/0.log" Apr 17 17:08:13.892043 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:13.892012 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-757c9669d9-7849l_84be4b0e-4b66-4061-a3d5-3a8708d4255b/prom-label-proxy/0.log" Apr 17 17:08:13.910229 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:13.910200 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-757c9669d9-7849l_84be4b0e-4b66-4061-a3d5-3a8708d4255b/kube-rbac-proxy-rules/0.log" Apr 17 17:08:13.932459 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:13.932428 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-757c9669d9-7849l_84be4b0e-4b66-4061-a3d5-3a8708d4255b/kube-rbac-proxy-metrics/0.log" Apr 17 17:08:15.090301 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:15.090268 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-js8f5_b8319ed2-b789-4cb0-969b-0ef6032e8f49/networking-console-plugin/0.log" Apr 17 17:08:15.829544 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:15.829508 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-gbgn6/perf-node-gather-daemonset-946b8"] Apr 17 17:08:15.835503 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:15.835475 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gbgn6/perf-node-gather-daemonset-946b8" Apr 17 17:08:15.844390 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:15.844233 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gbgn6/perf-node-gather-daemonset-946b8"] Apr 17 17:08:15.883535 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:15.883166 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/ccccacb4-27b3-49f0-91c6-ec1bb6cb8d81-proc\") pod \"perf-node-gather-daemonset-946b8\" (UID: \"ccccacb4-27b3-49f0-91c6-ec1bb6cb8d81\") " pod="openshift-must-gather-gbgn6/perf-node-gather-daemonset-946b8" Apr 17 17:08:15.883535 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:15.883241 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ccccacb4-27b3-49f0-91c6-ec1bb6cb8d81-sys\") pod \"perf-node-gather-daemonset-946b8\" (UID: \"ccccacb4-27b3-49f0-91c6-ec1bb6cb8d81\") " pod="openshift-must-gather-gbgn6/perf-node-gather-daemonset-946b8" Apr 17 17:08:15.883535 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:15.883294 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ccccacb4-27b3-49f0-91c6-ec1bb6cb8d81-lib-modules\") pod \"perf-node-gather-daemonset-946b8\" (UID: \"ccccacb4-27b3-49f0-91c6-ec1bb6cb8d81\") " pod="openshift-must-gather-gbgn6/perf-node-gather-daemonset-946b8" Apr 17 17:08:15.883535 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:15.883331 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/ccccacb4-27b3-49f0-91c6-ec1bb6cb8d81-podres\") pod \"perf-node-gather-daemonset-946b8\" (UID: 
\"ccccacb4-27b3-49f0-91c6-ec1bb6cb8d81\") " pod="openshift-must-gather-gbgn6/perf-node-gather-daemonset-946b8" Apr 17 17:08:15.883535 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:15.883358 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fwc9\" (UniqueName: \"kubernetes.io/projected/ccccacb4-27b3-49f0-91c6-ec1bb6cb8d81-kube-api-access-2fwc9\") pod \"perf-node-gather-daemonset-946b8\" (UID: \"ccccacb4-27b3-49f0-91c6-ec1bb6cb8d81\") " pod="openshift-must-gather-gbgn6/perf-node-gather-daemonset-946b8" Apr 17 17:08:15.984191 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:15.984161 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/ccccacb4-27b3-49f0-91c6-ec1bb6cb8d81-proc\") pod \"perf-node-gather-daemonset-946b8\" (UID: \"ccccacb4-27b3-49f0-91c6-ec1bb6cb8d81\") " pod="openshift-must-gather-gbgn6/perf-node-gather-daemonset-946b8" Apr 17 17:08:15.984367 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:15.984208 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ccccacb4-27b3-49f0-91c6-ec1bb6cb8d81-sys\") pod \"perf-node-gather-daemonset-946b8\" (UID: \"ccccacb4-27b3-49f0-91c6-ec1bb6cb8d81\") " pod="openshift-must-gather-gbgn6/perf-node-gather-daemonset-946b8" Apr 17 17:08:15.984367 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:15.984247 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ccccacb4-27b3-49f0-91c6-ec1bb6cb8d81-lib-modules\") pod \"perf-node-gather-daemonset-946b8\" (UID: \"ccccacb4-27b3-49f0-91c6-ec1bb6cb8d81\") " pod="openshift-must-gather-gbgn6/perf-node-gather-daemonset-946b8" Apr 17 17:08:15.984367 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:15.984269 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/ccccacb4-27b3-49f0-91c6-ec1bb6cb8d81-podres\") pod \"perf-node-gather-daemonset-946b8\" (UID: \"ccccacb4-27b3-49f0-91c6-ec1bb6cb8d81\") " pod="openshift-must-gather-gbgn6/perf-node-gather-daemonset-946b8" Apr 17 17:08:15.984367 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:15.984275 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/ccccacb4-27b3-49f0-91c6-ec1bb6cb8d81-proc\") pod \"perf-node-gather-daemonset-946b8\" (UID: \"ccccacb4-27b3-49f0-91c6-ec1bb6cb8d81\") " pod="openshift-must-gather-gbgn6/perf-node-gather-daemonset-946b8" Apr 17 17:08:15.984367 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:15.984292 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2fwc9\" (UniqueName: \"kubernetes.io/projected/ccccacb4-27b3-49f0-91c6-ec1bb6cb8d81-kube-api-access-2fwc9\") pod \"perf-node-gather-daemonset-946b8\" (UID: \"ccccacb4-27b3-49f0-91c6-ec1bb6cb8d81\") " pod="openshift-must-gather-gbgn6/perf-node-gather-daemonset-946b8" Apr 17 17:08:15.984367 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:15.984342 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ccccacb4-27b3-49f0-91c6-ec1bb6cb8d81-sys\") pod \"perf-node-gather-daemonset-946b8\" (UID: \"ccccacb4-27b3-49f0-91c6-ec1bb6cb8d81\") " pod="openshift-must-gather-gbgn6/perf-node-gather-daemonset-946b8" Apr 17 17:08:15.984616 ip-10-0-134-88 
kubenswrapper[2568]: I0417 17:08:15.984418 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/ccccacb4-27b3-49f0-91c6-ec1bb6cb8d81-podres\") pod \"perf-node-gather-daemonset-946b8\" (UID: \"ccccacb4-27b3-49f0-91c6-ec1bb6cb8d81\") " pod="openshift-must-gather-gbgn6/perf-node-gather-daemonset-946b8" Apr 17 17:08:15.984616 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:15.984427 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ccccacb4-27b3-49f0-91c6-ec1bb6cb8d81-lib-modules\") pod \"perf-node-gather-daemonset-946b8\" (UID: \"ccccacb4-27b3-49f0-91c6-ec1bb6cb8d81\") " pod="openshift-must-gather-gbgn6/perf-node-gather-daemonset-946b8" Apr 17 17:08:15.992178 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:15.992158 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fwc9\" (UniqueName: \"kubernetes.io/projected/ccccacb4-27b3-49f0-91c6-ec1bb6cb8d81-kube-api-access-2fwc9\") pod \"perf-node-gather-daemonset-946b8\" (UID: \"ccccacb4-27b3-49f0-91c6-ec1bb6cb8d81\") " pod="openshift-must-gather-gbgn6/perf-node-gather-daemonset-946b8" Apr 17 17:08:16.149439 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:16.149364 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gbgn6/perf-node-gather-daemonset-946b8" Apr 17 17:08:16.285907 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:16.285882 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gbgn6/perf-node-gather-daemonset-946b8"] Apr 17 17:08:16.288217 ip-10-0-134-88 kubenswrapper[2568]: W0417 17:08:16.288186 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podccccacb4_27b3_49f0_91c6_ec1bb6cb8d81.slice/crio-a8780879e433125333f835475a97a842b1c96d6a5497f156e4fe0b269dfc332e WatchSource:0}: Error finding container a8780879e433125333f835475a97a842b1c96d6a5497f156e4fe0b269dfc332e: Status 404 returned error can't find the container with id a8780879e433125333f835475a97a842b1c96d6a5497f156e4fe0b269dfc332e Apr 17 17:08:16.898879 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:16.898840 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gbgn6/perf-node-gather-daemonset-946b8" event={"ID":"ccccacb4-27b3-49f0-91c6-ec1bb6cb8d81","Type":"ContainerStarted","Data":"ce522f5f6afae4c93a1f5c82d9e850dc0c419eb19b8c4c97294a48b5e95ff38f"} Apr 17 17:08:16.899089 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:16.898887 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gbgn6/perf-node-gather-daemonset-946b8" event={"ID":"ccccacb4-27b3-49f0-91c6-ec1bb6cb8d81","Type":"ContainerStarted","Data":"a8780879e433125333f835475a97a842b1c96d6a5497f156e4fe0b269dfc332e"} Apr 17 17:08:16.899362 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:16.899332 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-gbgn6/perf-node-gather-daemonset-946b8" Apr 17 17:08:16.914875 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:16.914828 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-gbgn6/perf-node-gather-daemonset-946b8" podStartSLOduration=1.9148154590000002 podStartE2EDuration="1.914815459s" podCreationTimestamp="2026-04-17 17:08:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:08:16.913444838 +0000 UTC m=+941.820329655" watchObservedRunningTime="2026-04-17 17:08:16.914815459 +0000 UTC m=+941.821700275" Apr 17 17:08:17.418174 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:17.418145 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-bc8nx_b9cacbec-64af-43d7-85d4-fde767a1cfa3/dns/0.log" Apr 17 17:08:17.433762 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:17.433728 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-bc8nx_b9cacbec-64af-43d7-85d4-fde767a1cfa3/kube-rbac-proxy/0.log" Apr 17 17:08:17.525198 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:17.525158 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-pw92n_205623d6-59f4-4e27-8196-83daaf7a9d26/dns-node-resolver/0.log" Apr 17 17:08:17.981103 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:17.981070 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-ndxqw_66341dd8-b441-446a-be14-71280c6960b2/node-ca/0.log" Apr 17 17:08:18.795110 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:18.795079 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-764cf74f-d8lnj_87df2db0-639a-4670-8958-d99942902884/kube-auth-proxy/0.log" Apr 17 17:08:19.378796 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:19.378762 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-k7gjh_707c1483-2d89-4157-80d6-4356800a454b/serve-healthcheck-canary/0.log" Apr 17 17:08:19.798572 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:19.798540 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-lttq7_4a8134e0-b6b3-45d5-a4b4-a9b544913d40/insights-operator/0.log" Apr 17 17:08:19.799017 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:19.798730 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-lttq7_4a8134e0-b6b3-45d5-a4b4-a9b544913d40/insights-operator/1.log" Apr 17 17:08:19.929513 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:19.929482 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-jh5wg_ea016832-b0d1-48d2-ac01-0d47d66fdf3e/kube-rbac-proxy/0.log" Apr 17 17:08:19.952462 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:19.952428 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-jh5wg_ea016832-b0d1-48d2-ac01-0d47d66fdf3e/exporter/0.log" Apr 17 17:08:19.972325 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:19.972297 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-jh5wg_ea016832-b0d1-48d2-ac01-0d47d66fdf3e/extractor/0.log" Apr 17 17:08:21.825284 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:21.825254 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-6b98d9f7df-x62t4_f230a3a3-c8d5-4cda-acbb-010cbaf55e9a/manager/0.log" Apr 17 17:08:22.866847 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:22.866819 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-58fcc7cb5-hr5x9_c8b63800-1697-4644-bece-e40e2ff7d4e1/manager/0.log" Apr 17 17:08:22.915024 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:22.915000 2568 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-must-gather-gbgn6/perf-node-gather-daemonset-946b8" Apr 17 17:08:27.439567 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:27.439514 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-td7gx_36f13da0-be40-496a-a83a-c62049f5690b/kube-storage-version-migrator-operator/1.log" Apr 17 17:08:27.441343 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:27.441314 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-td7gx_36f13da0-be40-496a-a83a-c62049f5690b/kube-storage-version-migrator-operator/0.log" Apr 17 17:08:28.490033 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:28.490008 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ct6wx_561a91fc-d084-4461-b5bf-ad5bc1ac7a9e/kube-multus-additional-cni-plugins/0.log" Apr 17 17:08:28.506323 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:28.506297 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ct6wx_561a91fc-d084-4461-b5bf-ad5bc1ac7a9e/egress-router-binary-copy/0.log" Apr 17 17:08:28.522437 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:28.522411 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ct6wx_561a91fc-d084-4461-b5bf-ad5bc1ac7a9e/cni-plugins/0.log" Apr 17 17:08:28.542416 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:28.542387 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ct6wx_561a91fc-d084-4461-b5bf-ad5bc1ac7a9e/bond-cni-plugin/0.log" Apr 17 17:08:28.559296 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:28.559271 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ct6wx_561a91fc-d084-4461-b5bf-ad5bc1ac7a9e/routeoverride-cni/0.log" Apr 17 17:08:28.576159 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:28.576141 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ct6wx_561a91fc-d084-4461-b5bf-ad5bc1ac7a9e/whereabouts-cni-bincopy/0.log" Apr 17 17:08:28.591309 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:28.591285 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ct6wx_561a91fc-d084-4461-b5bf-ad5bc1ac7a9e/whereabouts-cni/0.log" Apr 17 17:08:28.779226 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:28.779155 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xwgv2_bcabc9cc-357b-429a-80c7-605b1281122f/kube-multus/0.log" Apr 17 17:08:28.833839 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:28.833762 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-bcjnr_1e75e15a-4a21-46fc-8ab6-d31ca6ee91f3/network-metrics-daemon/0.log" Apr 17 17:08:28.847913 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:28.847883 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-bcjnr_1e75e15a-4a21-46fc-8ab6-d31ca6ee91f3/kube-rbac-proxy/0.log" Apr 17 17:08:29.943393 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:29.943354 2568 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q5p9p_f461b687-3271-484f-a873-6a5fb0b1214d/ovn-controller/0.log" Apr 17 17:08:29.989523 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:29.989493 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q5p9p_f461b687-3271-484f-a873-6a5fb0b1214d/ovn-acl-logging/0.log" Apr 17 17:08:29.998177 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:29.998157 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q5p9p_f461b687-3271-484f-a873-6a5fb0b1214d/ovn-acl-logging/1.log" Apr 17 17:08:30.040233 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:30.040212 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q5p9p_f461b687-3271-484f-a873-6a5fb0b1214d/kube-rbac-proxy-node/0.log" Apr 17 17:08:30.061907 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:30.061887 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q5p9p_f461b687-3271-484f-a873-6a5fb0b1214d/kube-rbac-proxy-ovn-metrics/0.log" Apr 17 17:08:30.074819 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:30.074796 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q5p9p_f461b687-3271-484f-a873-6a5fb0b1214d/northd/0.log" Apr 17 17:08:30.090360 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:30.090341 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q5p9p_f461b687-3271-484f-a873-6a5fb0b1214d/nbdb/0.log" Apr 17 17:08:30.134271 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:30.134254 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q5p9p_f461b687-3271-484f-a873-6a5fb0b1214d/sbdb/0.log" Apr 17 17:08:30.312343 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:30.312291 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q5p9p_f461b687-3271-484f-a873-6a5fb0b1214d/ovnkube-controller/0.log" Apr 17 17:08:31.609218 ip-10-0-134-88 kubenswrapper[2568]: I0417 17:08:31.609184 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-blsng_33d21ed2-8e33-49bf-a161-a1a1a93a72d8/network-check-target-container/0.log"