Published in 2017 at "Acta Mathematica Sinica, English Series"
DOI: 10.1007/s10114-017-4543-x
Abstract: In this paper, we will show that every sub-Riemannian manifold is the Gromov–Hausdorff limit of a sequence of Riemannian manifolds.
Keywords:
Gromov–Hausdorff limit;
sub-Riemannian manifold
Published in 2018 at "Frontiers of Mathematics in China"
DOI: 10.1007/s11464-020-0823-3
Abstract: We prove that there do not exist quasi-isometric embeddings of connected nonabelian nilpotent Lie groups equipped with left-invariant Riemannian metrics into a metric measure space satisfying the curvature-dimension condition RCD(0, N) with…
Keywords:
nilpotent Lie groups;
non-embedding;
embedding theorems
Published in 2023 at "Nonlinearity"
DOI: 10.1088/1361-6544/accdae
Abstract: We consider Lagrangians that are defined only on the horizontal distribution of a sub-Riemannian manifold. The associated Hamiltonian is neither strictly convex nor coercive. We prove a result on homogenization of the Hamilton–Jacobi equation following…
Keywords:
homogenization;
sub-Riemannian Lagrangians
Published in 2023 at "Axioms"
DOI: 10.3390/axioms12040329
Abstract: On any strictly pseudoconvex CR manifold M, of CR dimension n, equipped with a positively oriented contact form θ, we consider natural ϵ-contractions, i.e., contractions gϵ of the Levi form Gθ, such that the norm…
Keywords:
immersion;
sub-Riemannian geometry;
Riemannian geometry
Published in 2022 at "Journal of Mathematical Imaging and Vision"
DOI: 10.48550/arxiv.2210.00935
Abstract: Group equivariant convolutional neural networks (G-CNNs) have been successfully applied in geometric deep learning. Typically, G-CNNs have the advantage over CNNs that they do not waste network capacity on training symmetries that should have been…
Keywords:
sub-Riemannian geometry;
networks;
PDE-based CNNs;
G-CNNs
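The equivariance property this abstract refers to can be illustrated with a minimal numpy sketch of a generic lifting correlation for the p4 group (translations plus 90° rotations). This is an illustrative assumption based on the standard G-CNN construction, not the PDE-based layers the paper itself develops: rotating the input rotates each feature map and cyclically shifts the rotation channel, so the network need not learn that symmetry from data.

```python
import numpy as np

def correlate2d_valid(img, ker):
    """Plain 'valid' cross-correlation (no kernel flipping), numpy only."""
    kh, kw = ker.shape
    out = np.empty((img.shape[0] - kh + 1, img.shape[1] - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * ker)
    return out

def p4_lifting_conv(img, ker):
    """Lift Z^2 -> p4: correlate with the kernel rotated by 0/90/180/270 degrees."""
    return np.stack([correlate2d_valid(img, np.rot90(ker, r)) for r in range(4)])

rng = np.random.default_rng(0)
img = rng.standard_normal((8, 8))
ker = rng.standard_normal((3, 3))

out = p4_lifting_conv(img, ker)             # shape (4, 6, 6)
out_rot = p4_lifting_conv(np.rot90(img), ker)

# Equivariance check: a 90-degree rotation of the input rotates every
# feature map and cyclically shifts the rotation channel by one.
for r in range(4):
    assert np.allclose(np.rot90(out[r]), out_rot[(r + 1) % 4])
```

The same check fails for an ordinary single-kernel CNN layer, which is the capacity argument the abstract alludes to: without the lifted channel, the network would have to learn each rotated copy of the filter separately.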