
Riemannian Optimization Algorithms For The Individual Difference Scaling Model In Multidimensional Scaling Analysis

Posted on: 2024-03-24
Degree: Master
Type: Thesis
Country: China
Candidate: Y X Zhang
Full Text: PDF
GTID: 2530307157984639
Subject: Mathematics
Abstract/Summary:
Multidimensional scaling analysis is a valuable tool for analyzing multidimensional data: based on the distances between points, it depicts the similarity and dissimilarity relations among observed objects in a low-dimensional space. It is a crucial technique for big data analysis, with broad applications in fields including economics, taxonomy, biochemistry, and psychology. This dissertation investigates a specific multidimensional scaling model known as the individual difference scaling model (INDSCAL), which can be described as follows: given m symmetric matrices Si (i = 1,...,m) of order n, find (Q, D1,...,Dm) solving

    min ∑_{i=1}^{m} ‖Si − Q Di Q^T‖_F^2  subject to (Q, D1,...,Dm) ∈ St(n,r) × D(r)^m,

where St(n,r) := {Q ∈ R^{n×r} | Q^T Q = I_r} denotes the Stiefel manifold and D(r)^m = D(r) × ... × D(r) is the linear manifold formed by m-tuples of r×r diagonal matrices. The INDSCAL model is a well-known tool in chemometrics and signal processing. Applying the geometry of product manifolds, this dissertation develops several Riemannian optimization algorithms, thoroughly examines their convergence properties, and evaluates their effectiveness through numerical experiments.

The dissertation comprises five chapters, each studying an optimization method in the Riemannian framework for the problem model. Chapter 1 provides an overview of the INDSCAL model, Riemannian optimization, and matrix optimization over product manifolds. Chapter 2 studies the geometric properties of the product manifold and derives explicit formulas for the Riemannian gradient and the Riemannian Hessian. Chapter 3 examines a Riemannian gradient descent method for the INDSCAL model: the negative Riemannian gradient is taken as the search direction, an appropriate step size is determined by line search, and the global convergence of the proposed algorithm is analyzed. Chapter 4 presents a class of Riemannian nonlinear conjugate gradient methods. Combining the Zhang-Hager technique with the classical monotone Armijo line search, it proposes a new class of nonmonotone search criteria; based on this line search, a class of Riemannian nonlinear conjugate gradient algorithms is proposed, and their global convergence is analyzed in detail. Chapter 5 focuses on a Riemannian inexact Newton method. The method combines a Riemannian line search technique with the Riemannian Newton algorithm to ensure convergence to a local minimum, and solves the Riemannian Newton equation inexactly by the conjugate gradient (CG) method to reduce the computational cost. The global and quadratic convergence of the inexact Riemannian Newton algorithm is analyzed, and necessary and sufficient conditions for the Riemannian Hessian of the objective function to be positive definite are given.
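The Riemannian gradient descent of Chapter 3 can be illustrated with a short sketch. The code below is a minimal, self-contained example and not the dissertation's implementation: it assumes the least-squares INDSCAL objective, uses the Euclidean-metric tangent projection on St(n,r), a QR-based retraction, and Armijo backtracking; all function names (`rgd_indscal`, `qr_retract`, etc.) are illustrative.

```python
import numpy as np

def indscal_cost(Q, Ds, Ss):
    # f(Q, D1..Dm) = sum_i || Si - Q Di Q^T ||_F^2
    return sum(np.linalg.norm(S - Q @ D @ Q.T, 'fro')**2 for S, D in zip(Ss, Ds))

def riemannian_grad(Q, Ds, Ss):
    # Residuals Ri = Si - Q Di Q^T (symmetric)
    Rs = [S - Q @ D @ Q.T for S, D in zip(Ss, Ds)]
    # Euclidean gradient w.r.t. Q, then projection onto the tangent
    # space of St(n,r) under the Euclidean metric: G - Q sym(Q^T G)
    GQ = -4 * sum(R @ Q @ D for R, D in zip(Rs, Ds))
    gQ = GQ - Q @ (0.5 * (Q.T @ GQ + GQ.T @ Q))
    # Gradient w.r.t. each Di lives in the diagonal-matrix subspace
    gDs = [np.diag(np.diag(-2 * Q.T @ R @ Q)) for R in Rs]
    return gQ, gDs

def qr_retract(Q, X):
    # QR-based retraction on the Stiefel manifold, with sign fix
    Qn, Rn = np.linalg.qr(Q + X)
    s = np.where(np.diag(Rn) < 0, -1.0, 1.0)
    return Qn * s

def rgd_indscal(Ss, r, iters=500, tau=0.5, beta=1e-4):
    n = Ss[0].shape[0]
    rng = np.random.default_rng(0)
    Q = np.linalg.qr(rng.standard_normal((n, r)))[0]  # random feasible start
    Ds = [np.eye(r) for _ in Ss]
    f = indscal_cost(Q, Ds, Ss)
    for _ in range(iters):
        gQ, gDs = riemannian_grad(Q, Ds, Ss)
        gnorm2 = np.sum(gQ**2) + sum(np.sum(g**2) for g in gDs)
        if gnorm2 < 1e-16:
            break
        t = 1.0
        while True:  # Armijo backtracking along the retracted curve
            Qn = qr_retract(Q, -t * gQ)
            Dn = [D - t * g for D, g in zip(Ds, gDs)]  # D(r)^m is linear
            fn = indscal_cost(Qn, Dn, Ss)
            if fn <= f - beta * t * gnorm2 or t < 1e-12:
                break  # accept (tiny-step guard avoids an infinite loop)
            t *= tau
        Q, Ds, f = Qn, Dn, fn
    return Q, Ds, f
```

On data generated exactly from the model (Si = Q0 Di Q0^T), the sketch drives the cost down while keeping Q orthonormal and each Di diagonal, which is the point of working on the product manifold rather than with explicit constraints.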
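The nonmonotone criterion of Chapter 4 replaces the monotone Armijo reference value f(x_k) with a weighted average C_k of past function values, updated by the Zhang-Hager recursion Q_{k+1} = η Q_k + 1, C_{k+1} = (η Q_k C_k + f_{k+1}) / Q_{k+1}. The following is a minimal Euclidean sketch of gradient descent with this rule, intended only to illustrate the search criterion (the dissertation applies it to conjugate gradient directions on the manifold); the name `nonmonotone_gd` and the parameter defaults are illustrative.

```python
import numpy as np

def nonmonotone_gd(f, grad, x0, eta=0.85, beta=1e-4, tau=0.5, iters=200):
    # Gradient descent with the Zhang-Hager nonmonotone Armijo rule:
    # accept step t when f(x - t g) <= C_k - beta * t * ||g||^2,
    # where C_k is a weighted average of past function values.
    x = x0.copy()
    fx = f(x)
    C, Qk = fx, 1.0  # C_0 = f(x_0), Q_0 = 1
    for _ in range(iters):
        g = grad(x)
        if np.linalg.norm(g) < 1e-10:
            break
        t = 1.0
        while f(x - t * g) > C - beta * t * (g @ g) and t > 1e-12:
            t *= tau  # backtrack
        x = x - t * g
        fx = f(x)
        # Zhang-Hager update of the reference value
        Qn = eta * Qk + 1.0
        C = (eta * Qk * C + fx) / Qn
        Qk = Qn
    return x, fx
```

Because C_k ≥ f(x_k), individual iterations may increase f while the averaged reference still enforces overall descent, which typically allows longer steps than the monotone Armijo rule.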
Keywords/Search Tags: Multidimensional scaling analysis, Individual difference scaling, Product manifold, Riemannian optimization, Riemannian inexact Newton method