An optimization problem for continuous submodular functions

Laszlo Csirmaz


Real continuous submodular functions, which generalize the corresponding discrete notion to the continuous domain, have recently gained considerable attention. The analogous notion for entropy functions requires additional properties: a real function defined on the non-negative orthant of $\mathbb{R}^n$ is entropy-like (EL) if it is submodular, non-decreasing, takes the value zero at zero, and has the Diminishing Returns property.
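In symbols, writing $x\vee y$ and $x\wedge y$ for the coordinatewise maximum and minimum of $x,y\ge 0$, these four conditions can be stated as follows (a standard formulation of continuous submodularity and Diminishing Returns; the paper's exact definitions may differ in detail):

```latex
% submodularity, normalization, monotonicity:
f(x\vee y) + f(x\wedge y) \le f(x) + f(y), \qquad f(0) = 0, \qquad
x \le y \;\Longrightarrow\; f(x) \le f(y).
% Diminishing Returns: for all x \le y, \delta > 0, and unit vectors e_i,
f(x + \delta e_i) - f(x) \;\ge\; f(y + \delta e_i) - f(y).
```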
Motivated by problems concerning the Shannon complexity of multipartite secret sharing, a special case of the following general optimization problem is considered: find the minimal cost of the EL functions that satisfy certain constraints.
In our special case the cost of an EL function is the maximal value of the $n$ partial derivatives at zero; another possibility would be the supremum of the function's range. The constraints are specified by a smooth bounded surface $S$ cutting off a downward-closed subset. An EL function is feasible if at every internal point of $S$ the left and right partial derivatives of the function differ by at least one.
A general lower bound for the minimal cost is given in terms of the normals of the surface $S$. The bound is tight when $S$ is linear. In the two-dimensional case the same bound is tight for convex or concave $S$. It is shown that the optimal EL function is not necessarily unique. The paper concludes with several open problems.
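The defining properties lend themselves to a quick numerical spot check. The sketch below is an illustration, not taken from the paper: the candidate function $f$ (a concave, non-decreasing function of the coordinate sum, which is a standard example of a continuous submodular function) and the random sampling scheme are our own choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical candidate EL function: a concave, non-decreasing function
# of the coordinate sum. Such functions are submodular, vanish at zero,
# are non-decreasing, and have the Diminishing Returns property.
def f(x):
    return np.sqrt(x.sum())

def check_el(f, n=3, trials=1000, tol=1e-9):
    """Numerically spot-check the four EL properties at random points."""
    for _ in range(trials):
        x = rng.uniform(0, 5, n)
        y = rng.uniform(0, 5, n)
        # submodularity: f(x v y) + f(x ^ y) <= f(x) + f(y)
        if f(np.maximum(x, y)) + f(np.minimum(x, y)) > f(x) + f(y) + tol:
            return False
        # non-decreasing: x <= x + y implies f(x) <= f(x + y)
        if f(x) > f(x + y) + tol:
            return False
        # Diminishing Returns: the gain from adding delta to coordinate i
        # is no larger at the bigger point x + y than at x
        i = rng.integers(n)
        e = np.zeros(n)
        e[i] = rng.uniform(0, 1)
        if f(x + y + e) - f(x + y) > f(x + e) - f(x) + tol:
            return False
    # takes the value zero at zero
    return bool(f(np.zeros(n)) == 0.0)

print(check_el(f))  # expect True for this f
```

A random check of this kind can only refute the EL property, never certify it, but it is a cheap sanity test for candidate constructions.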


Keywords: submodular optimization; entropy method; secret sharing






