Uniform Manifold Approximation with Two-phase Optimization
Uniform Manifold Approximation with Two-phase Optimization (UMATO) is a dimensionality reduction technique that preserves the global as well as the local structure of high-dimensional data. Most existing dimensionality reduction algorithms focus on only one of these two aspects, which can lead to overlooking or misinterpreting important patterns in the data. To address this, we propose a two-phase optimization: global optimization followed by local optimization. First, we capture the global structure by selecting and optimizing hub points. Next, we initialize and optimize the remaining points using the nearest neighbor graph. Our experiments with one synthetic and three real-world datasets show that UMATO outperforms baseline algorithms, such as PCA, t-SNE, Isomap, UMAP, Topological Autoencoders, and Anchor t-SNE, in terms of global measures and qualitative projection results.
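To give an intuition for the first phase, here is a minimal sketch of hub-point selection. The selection rule below (picking points with high in-degree in the k-nearest-neighbor graph) is an illustrative assumption for this sketch, not UMATO's exact procedure; the function name `select_hubs` is hypothetical.

```python
import numpy as np

def select_hubs(X, k=5, n_hubs=10):
    """Pick n_hubs points that many others list among their k nearest
    neighbors -- a rough proxy for points that anchor the global structure."""
    # pairwise squared Euclidean distances
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(d2, np.inf)            # exclude self-neighbors
    knn = np.argsort(d2, axis=1)[:, :k]     # indices of k nearest neighbors
    # in-degree: how often each point appears in other points' neighbor lists
    in_degree = np.bincount(knn.ravel(), minlength=len(X))
    return np.argsort(in_degree)[::-1][:n_hubs]

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
hubs = select_hubs(X, k=5, n_hubs=10)
print(hubs)
```

In the actual algorithm, the hubs are optimized first to lay out the global skeleton, and the remaining points are then placed locally around them.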
- Python 3.6 or greater
- pandas (to read csv files)
You can try the following code to see the result:
```bash
# install requirements
pip install scikit-learn numpy numba pandas

# download specific (e.g., MNIST) datasets
bash download.sh mnist

# run UMATO
python test.py --data=mnist
```
Training models & Generating embedding results
We generate embedding results for each algorithm for comparison. The algorithms we compare against are PCA, Isomap, t-SNE, UMAP, Topological Autoencoders, and Anchor t-SNE.
We can run each method separately, or all of them at once.
```bash
# run all datasets
bash run-benchmark.sh

# run specific dataset (e.g., MNIST dataset)
bash run-benchmark.sh mnist
```
This covers PCA, t-SNE, UMAP, and Topological Autoencoders. To run Anchor t-SNE, you need CUDA and a GPU; please refer to the AtSNE repository for its specification.
```bash
# see visualization
cd visualization

# install requirements
npm install

# run svelte app
npm run dev
```
Embedding results of the Spheres dataset for each algorithm
We also compared the embedding results quantitatively, using measures such as Distance to a Measure (DTM) and the KL divergence between density distributions.
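As a rough illustration of the KL-divergence measure, the sketch below estimates a density at each point with a Gaussian kernel in both the original and the embedded space, then compares the two distributions. The estimator and bandwidth here are simplifying assumptions, not the exact formulation used in the evaluation code.

```python
import numpy as np

def density(X, sigma=1.0):
    """Gaussian-kernel density estimate at each point, normalized to sum to 1."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    rho = np.exp(-d2 / sigma).sum(axis=1)
    return rho / rho.sum()

def kl_divergence(p, q, eps=1e-12):
    """KL(p || q) between two discrete distributions, with smoothing."""
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))   # high-dimensional data
Z = X[:, :2]                     # stand-in for a 2-D embedding
print(kl_divergence(density(X), density(Z)))
```

A lower divergence means the embedding's density distribution stays closer to the original data's, which is why lower is better in the tables below.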
To print the quantitative result:
```bash
# print table result
python -m evaluation.comparison --algo=all --data=spheres --measure=all
```
Results for the Spheres dataset
- DTM & KL divergence: Lower is better
- The winner and runner-up are in bold.
- van der Maaten, L., & Hinton, G. (2008). Visualizing data using t-SNE. JMLR, 9(Nov), 2579-2605.
- McInnes, L., Healy, J., & Melville, J. (2018). UMAP: Uniform manifold approximation and projection for dimension reduction. arXiv preprint arXiv:1802.03426.
- Moor, M., Horn, M., Rieck, B., & Borgwardt, K. (2020). Topological autoencoders. ICML.
- Fu, C., Zhang, Y., Cai, D., & Ren, X. (2019). AtSNE: Efficient and robust visualization on GPU through hierarchical optimization. In Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining (pp. 176-186).