
Abstract

High dimension, low sample size (HDLSS) data are prevalent in many modern fields of study, and there has been growing interest in using machine learning and statistical methods to extract valuable information from such data sets. This has driven demand for efficient learning in high dimensions. Naturally, as the dimension of the input data increases, the learning task becomes more difficult due to growing computational and statistical complexity. It is therefore crucial to overcome the curse of dimensionality within a reasonable time frame, in order to obtain the insights needed to keep a competitive edge. For HDLSS problems, classical methods such as the support vector machine can be improved to alleviate data piling at the margin. Questioning the geometric assumptions placed on the input data leads naturally to convex optimisation formulations, and this gives rise to methods such as distance weighted discrimination (DWD), which can be modelled as a second-order cone programming problem and solved by interior-point methods when the sample size and feature dimension of the data are moderate. In this paper, our focus is on designing an even more scalable and robust algorithm for solving large-scale generalized DWD problems.
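As a rough illustration of how DWD admits a convex, second-order cone representable formulation, the sketch below writes a generalized DWD problem in CVXPY and hands it to a generic interior-point solver. This is a minimal sketch under assumed notation, not the authors' algorithm: the function name `fit_gdwd`, the penalty parameter `C_penalty`, and the exponent `q` are illustrative choices.

```python
# Minimal sketch of a generalized DWD classifier as a convex program.
# Assumed form: minimize sum_i r_i^{-q} + C * sum_i xi_i
#               subject to r_i = y_i (x_i^T w + beta) + xi_i, xi >= 0, ||w|| <= 1.
import numpy as np
import cvxpy as cp

def fit_gdwd(X, y, C_penalty=1.0, q=1.0):
    """Fit a generalized DWD separator on X (n x d) with labels y in {-1, +1}."""
    n, d = X.shape
    w = cp.Variable(d)                  # normal vector of the separating hyperplane
    beta = cp.Variable()                # intercept
    xi = cp.Variable(n, nonneg=True)    # slack variables

    # Residuals r_i = y_i (x_i^T w + beta) + xi_i; the loss forces them positive.
    r = cp.multiply(y, X @ w + beta) + xi

    # Generalized DWD loss plus slack penalty; for q = 1 this is SOCP-representable.
    objective = cp.Minimize(cp.sum(cp.power(r, -q)) + C_penalty * cp.sum(xi))
    constraints = [cp.norm(w, 2) <= 1]

    cp.Problem(objective, constraints).solve()   # generic interior-point solution
    return w.value, beta.value

# Toy usage on synthetic HDLSS-style data (feature dimension much larger than n).
rng = np.random.default_rng(0)
X = rng.standard_normal((20, 500))
y = np.where(rng.standard_normal(20) > 0, 1.0, -1.0)
w, beta = fit_gdwd(X, y)
```

For q = 1 this recovers the standard DWD loss; the point of the paper is precisely that, at large scale, one needs an algorithm more scalable and robust than the generic interior-point solve invoked here.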
