Leonid Berlyand

From Wikipedia, the free encyclopedia

Leonid Berlyand
Alma mater: Kharkiv State University
Known for: works on homogenization
Awards: Humboldt Prize
Scientific career
Fields: Applied mathematics, homogenization, mathematical biology, deep learning
Institutions: Kharkiv State University, Semenov Institute of Chemical Physics, Penn State University
Thesis: Homogenization of Elasticity Equations in Domains with Fine-Grained Boundaries
Doctoral advisor: Evgeny Khruslov

Leonid Berlyand is a Soviet and American mathematician, a professor at Pennsylvania State University. He is known for his works on homogenization, Ginzburg–Landau theory, mathematical modeling of active matter, and mathematical foundations of deep learning.

Leonid Berlyand was born in Kharkiv, Ukraine. His father, Viktor Berlyand, was a mechanical engineer, and his mother, Mayya Genkina, an electronics engineer. He graduated from the department of mathematics and mechanics at the V. N. Karazin Kharkiv National University and then obtained his Ph.D. from the same university. In his Ph.D. thesis, he applied homogenization theory to the study of elastic composite materials. After his defense, he worked at the Semenov Institute of Chemical Physics in Moscow. In 1991 he moved to the United States and joined Pennsylvania State University, where he has served as a full professor since 2003. He has held long-term visiting positions at the Collège de France, Princeton University, the California Institute of Technology, the University of Chicago, the Max Planck Institute for Mathematics in the Sciences, Sorbonne University, Heidelberg University, and the Argonne and Los Alamos National Laboratories.

His research has drawn support from the National Science Foundation (NSF),[1] NIH/NIGMS,[2] the Applied Mathematics Program of the DOE Office of Science,[3] the US–Israel Binational Science Foundation (BSF),[4] and the NATO Science for Peace and Security Programme. Berlyand has authored over 100 works on homogenization theory and PDE/variational problems in biology and materials science. He has organized a number of professional conferences and serves as one of two founding co-directors of the Center for Mathematics of Living and Mimetic Matter and the Center for Interdisciplinary Mathematics at Penn State University. He has supervised 16 Ph.D. students and 8 postdoctoral fellows.[5][6]

Research

Drawing upon fundamental works in classical homogenization theory, Berlyand has advanced homogenization methods across a wide range of applications, obtaining mathematical results in areas including biology, fluid mechanics, superconductivity, elasticity, and materials science. His mathematical modeling explains striking experimental results on the collective swimming of bacteria.[7] His homogenization approach to multiscale problems was turned into a practical computational tool through the concept of polyharmonic homogenization, which led to a new type of multiscale finite elements.[8] Together with H. Owhadi, he introduced a "transfer-of-approximation" modeling concept, based on the similarity of the asymptotic behavior of the errors of Galerkin solutions for two elliptic PDEs.[9][10] He has also contributed to mathematical aspects of the Ginzburg–Landau theory of superconductivity and superfluidity by introducing a new class of semi-stiff boundary problems.[11]

In the 2020s, Berlyand turned to the rapidly developing area of deep learning, focusing on its mathematical foundations. Specifically, he has addressed the convergence and stability of training algorithms for deep neural networks (DNNs)[12] and the formation of fixed points of DNNs.[13] Subsequently he worked on applying the Marchenko–Pastur distribution from random matrix theory to the pruning of DNNs, which drastically improves training efficiency.[14] Berlyand also studied the existence and stability of fixed points of autoencoder neural networks, in collaboration with his Ukrainian colleagues.[13] In 2023, together with Pierre-Emmanuel Jabin, he published an introductory textbook on the mathematics of deep learning.[15]
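The general idea behind Marchenko–Pastur-based pruning can be sketched as follows; this is an illustrative reconstruction, not Berlyand's actual algorithm. Singular values of a trained weight matrix that fall below the Marchenko–Pastur bulk edge are statistically indistinguishable from those of a pure-noise random matrix, so only the components above the edge are kept. The function name `mp_prune` and the noise scale `sigma` are assumptions for this sketch.

```python
import numpy as np

def mp_prune(W, sigma=1.0):
    """Keep only singular values above the Marchenko-Pastur bulk edge.

    W is an (n, m) weight matrix with n >= m; the noise model is
    i.i.d. entries of variance sigma**2 / n.
    """
    n, m = W.shape
    # Asymptotic upper edge of the singular-value distribution for pure noise.
    edge = sigma * (1.0 + np.sqrt(m / n))
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    keep = s > edge
    # Low-rank reconstruction from the surviving "signal" components.
    return (U[:, keep] * s[keep]) @ Vt[keep, :], int(keep.sum())
```

On a noise matrix with a single strong rank-one "spike" added, this procedure retains only a handful of components near the planted rank, since the threshold `sigma * (1 + sqrt(m/n))` matches the bulk of the noise spectrum.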

Awards and honors

Membership in professional associations

Editorship

Books (author)

  • "Introduction to Network Approximation for Materials Modeling" (with A. Kolpakov and A. Novikov), Cambridge University Press, 2012.
  • "Getting Acquainted with Homogenization and Multiscale" (with V. Rybalko), Compact Textbooks in Mathematics series, Springer, 2018.
  • "Mathematics of Deep Learning: An Introduction" (with P.-E. Jabin), De Gruyter Textbook series, De Gruyter, 2023.

Selected publications

References
