
Manifold Learning Primer

Almost all of my research experience to date has been related to manifold learning, as are many of the papers I'm looking forward to breaking down on this blog. So this post is a quick, self-contained primer on manifold learning: the theory, the practice, and why you should care about it in the robotics world.

[Figure: An example of the manifold learning process on the classic "Swiss Roll" dataset, using ISOMAP.]

The Curse of Dimensionality

Broadly speaking, there is no precise definition of the "Curse of Dimensionality". It's a catch-all term used by machine learning practitioners for the variety of issues that high-dimensional data can cause. In this section, I'll cover several of these issues and discuss why the dimensionality of the problem is specifically to blame.

One of the clearest examples of the curse of dimensionality is the intractability of many sampling-based probabilistic algorithms in high dimensions.
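To make this concrete, here is a minimal Monte Carlo sketch (the function name and sample count are my own choices, not from the post): it estimates the fraction of points drawn uniformly from the hypercube [-1, 1]^d that land inside the inscribed unit ball. As the dimension grows, that fraction collapses toward zero, which is one reason naive rejection-style sampling becomes intractable in high dimensions.

```python
import random

def fraction_in_unit_ball(dim, n_samples=100_000, seed=0):
    """Estimate the fraction of uniform samples from the [-1, 1]^dim
    hypercube that fall inside the inscribed unit ball."""
    rng = random.Random(seed)
    inside = 0
    for _ in range(n_samples):
        point = [rng.uniform(-1.0, 1.0) for _ in range(dim)]
        # A point is inside the unit ball iff its squared norm is <= 1.
        if sum(x * x for x in point) <= 1.0:
            inside += 1
    return inside / n_samples

for d in (2, 5, 10):
    print(d, fraction_in_unit_ball(d))
```

In 2D roughly 78% of samples land inside the disk (the exact value is pi/4), but by 10 dimensions fewer than 1% do: almost all of the cube's volume sits in its corners, so a rejection sampler wastes nearly every draw.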