At Princeton I was fortunate to work with Prof. Amir Ali Ahmadi and Prof. Georgina Hall. We were interested in developing a technique for incorporating strict monotonicity guarantees into polynomial regression on compact sets. Monotonicity arises frequently in practice: it can be a way to incorporate prior knowledge, encode fairness criteria, and improve robustness and generalization. We formulated the problem of learning feature-monotone functions as a constrained polynomial optimization problem. To provide theoretical validation, we proved a Weierstrass-like polynomial approximation theorem restricted to monotone multivariate functions. We also showed that learning monotonically constrained polynomial functions is NP-hard. To address this complexity, we proposed a relaxation to a constrained sum of squares (SOS) optimization problem, which we then solved via semidefinite programming.
Currently we are extending the framework to incorporate more complex shape constraints on the regressand. One such key constraint in practice is convexity with respect to a subset of the variables. We have a manuscript under review at the journal Operations Research that describes a general approach, “Shape Constrained Regression using Sum of Squares Polynomials”. See unpublished work here.