In machine learning, support vector machines (SVMs) are a well-proven family of algorithms for predicting binary class labels. At the heart of the algorithm is a linear model trained with a particular margin-based loss function, the so-called hinge loss. A variant called support vector regression (SVR) is used to predict scalar values instead, and its loss function is sometimes referred to as epsilon-insensitive loss.
We believe that some problems in machine learning might benefit from combining these two types of loss functions. Similar ideas have been published before (https://arxiv.org/abs/1106.3397) but have received little attention so far, so this type of loss function has not been thoroughly studied and is not readily available in open-source software packages.
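To make the two loss functions concrete, here is a minimal NumPy sketch of both, together with one naive way they could be blended. The weighting scheme in `hybrid_loss` is purely illustrative and is not the formulation from the paper linked above:

```python
import numpy as np

def hinge_loss(y, f):
    """Margin-based hinge loss used in SVM classification.
    y is the true label in {-1, +1}; f is the raw model output."""
    return np.maximum(0.0, 1.0 - y * f)

def eps_insensitive_loss(y, f, eps=0.1):
    """Epsilon-insensitive loss used in SVR: errors smaller than
    eps cost nothing, and larger errors grow linearly."""
    return np.maximum(0.0, np.abs(y - f) - eps)

def hybrid_loss(y_cls, y_reg, f, alpha=0.5, eps=0.1):
    """One naive hybrid (an illustrative assumption): a convex
    combination of the two losses for a sample that carries both
    a class label y_cls and a regression target y_reg."""
    return (alpha * hinge_loss(y_cls, f)
            + (1.0 - alpha) * eps_insensitive_loss(y_reg, f, eps))
```

Both losses are piecewise linear, so a hybrid of this form stays convex in the model output, which is one reason an SVM-style solver remains plausible for it.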
The master's thesis work focuses on the following:
- Implement an SVM/SVR-type algorithm with this hybrid loss function,
  - either in an existing framework (e.g. PyTorch, scikit-learn, or OpenCV),
  - or from scratch (e.g. in Python or C++).
- Analyze the performance of this hybrid approach compared to conventional SVM or SVR, in terms of
  - prediction accuracy, as well as
  - convergence speed and robustness.
- Try to characterize for what kinds of problems, if any, this approach is beneficial.
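The "from scratch" route could start from something as simple as subgradient descent on the hinge loss. The following is a minimal NumPy sketch of such a baseline trainer; the hyperparameters and the toy data are illustrative assumptions, not tuned choices:

```python
import numpy as np

def train_linear_svm(X, y, lam=0.01, lr=0.1, epochs=200, seed=0):
    """Minimal subgradient-descent trainer for a linear SVM with
    hinge loss and L2 regularization (a from-scratch baseline sketch)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0
    for _ in range(epochs):
        for i in rng.permutation(n):
            margin = y[i] * (X[i] @ w + b)
            if margin < 1:
                # Point violates the margin: step along the hinge subgradient.
                w = (1.0 - lr * lam) * w + lr * y[i] * X[i]
                b += lr * y[i]
            else:
                # Otherwise only the regularizer contributes (weight decay).
                w = (1.0 - lr * lam) * w
    return w, b

# Two linearly separable toy clusters as a sanity check.
X = np.array([[2., 2.], [3., 3.], [2., 3.],
              [-2., -2.], [-3., -3.], [-2., -3.]])
y = np.array([1, 1, 1, -1, -1, -1])
w, b = train_linear_svm(X, y)
preds = np.sign(X @ w + b)
```

Swapping the hinge subgradient for a subgradient of the hybrid loss is then a local change in the inner loop, which makes this a convenient starting point for the comparison experiments above.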
There are possibilities to shape this project according to your interests; some options could be:
- Contribute the implementation back to an open-source framework.
- Summarize your findings in a research article.
- Survey the literature for similar algorithms, and focus on a broader analysis.
We are looking for candidates with:
- An interest in understanding algorithms.
- Familiarity with mathematical optimization or machine learning.
- Some programming experience, not necessarily in the languages above.
We offer:
- A welcoming company culture.
- A supervisor with many years of experience in thesis supervision.
For more information about the position, contact:
Erik Hedberg, Algorithm developer, email@example.com
Charlotte Axelsson, HR Manager, +46 739 20 99 50.
We welcome your application by 15 October at the latest!