Torch dependency & Callback functionality #181
Replies: 2 comments
-
@TimOrtkamp Thanks very much for your kind suggestions. We have removed torch in the latest version (v0.0.79), so the memory requirements are now significantly reduced, which should better meet your application requirements. In the future, we will add the callback functionality you suggested, as it does indeed seem necessary.
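For reference, here is a minimal sketch of how such a callback hook could be wired into a generic iterative optimization loop. This is only an illustration of the idea, not the planned PyPop7 interface: the names `run_optimizer`, `callback`, and the keys of the info dictionary are hypothetical.

```python
# Hypothetical sketch of a per-iteration callback hook (not the actual PyPop7 API).
from typing import Callable, Optional
import numpy as np


def run_optimizer(fitness, x0, max_iterations=100,
                  callback: Optional[Callable[[dict], None]] = None):
    """Toy random-search loop illustrating where a callback could be invoked."""
    best_x, best_y = np.asarray(x0, dtype=float), fitness(x0)
    for iteration in range(max_iterations):
        x = best_x + 0.1*np.random.standard_normal(best_x.shape)
        y = fitness(x)
        if y < best_y:
            best_x, best_y = x, y
        if callback is not None:
            # Hand the caller a snapshot of the current state after each iteration.
            callback({'iteration': iteration, 'best_x': best_x, 'best_y': best_y})
    return best_x, best_y


# Example: log progress every 10 iterations.
def log_progress(info):
    if info['iteration'] % 10 == 0:
        print(f"iter {info['iteration']}: best fitness = {info['best_y']:.6f}")


run_optimizer(lambda x: float(np.sum(x**2)), np.ones(5), callback=log_progress)
```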
-
@Evolutionary-Intelligence Fantastic! I've spent the day re-integrating your package into mine, and now it is running as desired :) I've added references to your documentation page and preprint within the respective wrapper files. Also, I'm looking forward to the addition of a callback functionality! Maybe a follow-up question: of the algorithms implemented in PyPop7, are there specific ones you would recommend for a high-dimensional (>5,000 variables), nonlinear, and partially non-convex optimization problem? So far, I've had good experience with LMCMA and LMMAES, but it would be interesting to know whether there are further options worth testing (without having to go through all available large-scale optimization algorithms) ;)
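For context, this is roughly how I'm calling the large-scale optimizers at the moment — a sketch following the problem/options dictionary pattern from the PyPop7 documentation. The dimension, budget, and step size below are placeholder values rather than my actual pyanno4rt settings; please correct me if I've mixed up any option names.

```python
# Sketch of LMCMA on a high-dimensional test problem, following the
# problem/options dictionary pattern from the PyPop7 documentation.
import numpy as np
from pypop7.benchmarks.base_functions import rosenbrock
from pypop7.optimizers.es.lmcma import LMCMA

ndim = 5000  # placeholder dimensionality, similar in scale to my use case
problem = {'fitness_function': rosenbrock,
           'ndim_problem': ndim,
           'lower_boundary': -5.0*np.ones((ndim,)),
           'upper_boundary': 5.0*np.ones((ndim,))}
options = {'max_function_evaluations': 100000,  # evaluation budget (placeholder)
           'seed_rng': 0,
           'sigma': 3.0}  # initial global step size (placeholder)

results = LMCMA(problem, options).optimize()
print(results['best_so_far_y'])  # best fitness value found so far
```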
-
Hello PyPop7-Devs,
I'm the main developer of pyanno4rt, a package for radiotherapy dose optimization, and I've tested a few algorithms from your package. Really interesting selection of algorithms, and even though my optimization problems are quite high-dimensional (>5,000 variables), the computational speed seems pretty good, which is why I'm seriously thinking about integrating your package as a dependency in the first release of pyanno4rt! However, there are currently two major pitfalls for me, maybe you have a solution here:
1. The torch dependency considerably increases the memory requirements of my package.
2. There is no callback functionality yet to monitor or interact with the optimization while it is running.
Thanks in advance for an answer, and for your impressive work!
Regards,
Tim