
I wonder if the author is aware of CVX / cvxpy / "disciplined convex programming"?

I think this would be easy to express in that framework, and I wonder how the performance would compare. Basically, the DSL can track that functions and constraints are convex by construction, which removes the obligation to implement various bits of optimization logic yourself (and lets you switch between optimization details at a config level).

https://www.cvxpy.org/ https://cvxr.com/cvx/
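
To make that concrete, here's a minimal sketch of the curvature tracking CVXPY does (the particular objective and variable names are just illustrative, not from the project under discussion):

    # Minimal sketch: CVXPY tracks the curvature of every expression,
    # so non-convex formulations are rejected before any solver runs.
    import cvxpy as cp

    x = cp.Variable(3)
    expr = cp.sum_squares(x) + cp.norm(x, 1)   # convex by construction
    print(expr.curvature, expr.is_dcp())       # CONVEX True

    prob = cp.Problem(cp.Minimize(expr), [cp.sum(x) == 1])
    prob.solve()    # swap solvers via the solver= argument, no model changes
    print(x.value)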



Good pointer! I did know about CVXPY but haven't used it yet. Requiring everything to be (log-)convex constrains user programs quite a bit, but I imagine it would have worked in this case because the objective is linear and the feasible set is convex. I won't deny the role that accident and force of habit played in picking this particular implementation path; I used PyTorch because it's my daily driver :D and because I had found 'mctorch', which provided the Riemannian SGD logic.


Actually, you can solve the assignment problem as a linear program (LP). Just do

    min  sum_i sum_j c_{ij} * x_{ij}

    s.t. sum_i x_{ij} == 1,  for all j
         sum_j x_{ij} == 1,  for all i
         x_{ij} >= 0,        for all i, j
and you automagically get an integer solution in the x_{ij} due to the structure of the constraints (the constraint matrix is totally unimodular, so every vertex of the feasible polytope is a 0/1 permutation matrix).

CVXPY would be a good way to implement this these days.
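
Something like this, as a rough sketch (cost matrix and problem size are made up for illustration):

    # Sketch of the assignment LP above in CVXPY; C is a random cost
    # matrix standing in for c_{ij}, n is an arbitrary problem size.
    import cvxpy as cp
    import numpy as np

    n = 4
    C = np.random.default_rng(0).random((n, n))

    X = cp.Variable((n, n), nonneg=True)      # x_{ij} >= 0
    constraints = [cp.sum(X, axis=0) == 1,    # sum_i x_{ij} == 1, for all j
                   cp.sum(X, axis=1) == 1]    # sum_j x_{ij} == 1, for all i
    prob = cp.Problem(cp.Minimize(cp.sum(cp.multiply(C, X))), constraints)
    prob.solve()

    # Total unimodularity means the LP optimum is (up to numerical
    # tolerance) already a 0/1 permutation matrix.
    print(np.round(X.value))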



