Algorithmic Differentiation of Linear Algebra Functions with Application in Optimum Experimental Design (Extended Version)
Computer Science – Data Structures and Algorithms
Scientific paper
2010-01-11
We derive algorithms for higher-order derivative computation of the rectangular $QR$ decomposition and of the eigenvalue decomposition of symmetric matrices with distinct eigenvalues, in the forward and reverse mode of algorithmic differentiation (AD), using univariate Taylor propagation of matrices (UTPM). Linear algebra functions are regarded as elementary functions and not as algorithms. The presented algorithms are implemented in the BSD-licensed AD tool \texttt{ALGOPY}. Numerical tests show that the UTPM algorithms derived in this paper produce results close to machine precision accuracy. The theory developed in this paper is applied to compute the gradient of an objective function motivated by optimum experimental design: $\nabla_x \Phi(C(J(F(x,y))))$, where $\Phi = \lambda_1$, the largest eigenvalue of $C$, $C = (J^T J)^{-1}$, $J = \frac{\partial F}{\partial y}$, and $F = F(x,y)$.
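To make the setting concrete, the following is a minimal Python sketch of how such a gradient could be computed with \texttt{ALGOPY}, treating the QR and symmetric eigenvalue decompositions as elementary functions. It is not the authors' code: the model Jacobian \texttt{model_jacobian} (its 5x2 shape and exponential entries) is invented purely for illustration, and the calls (\texttt{algopy.UTPM.init_jacobian}, \texttt{algopy.UTPM.extract_jacobian}, \texttt{algopy.qr}, \texttt{algopy.eigh}, and the \texttt{CGraph} tracer) follow ALGOPY's documented interface but should be checked against the installed version.

    import numpy
    import algopy

    def model_jacobian(x):
        # Hypothetical design-dependent Jacobian J(x) = dF/dy; the 5x2 shape
        # and the exponential entries are invented for illustration only.
        t = numpy.linspace(0.1, 1.0, 5)
        J = algopy.zeros((5, 2), dtype=x)
        for i in range(5):
            J[i, 0] = algopy.exp(-x[0] * t[i])
            J[i, 1] = -x[1] * t[i] * algopy.exp(-x[0] * t[i])
        return J

    def Phi(J):
        # C = (J^T J)^{-1}, built from the QR decomposition of J; qr, inv,
        # dot and eigh are differentiated as elementary functions.
        Q, R = algopy.qr(J)
        Rinv = algopy.inv(R)
        C = algopy.dot(Rinv, Rinv.T)
        lam, U = algopy.eigh(C)   # eigenvalues in ascending order, as in numpy
        return lam[-1]            # largest eigenvalue of C

    x0 = numpy.array([1.3, 0.7])

    # Forward mode: propagate univariate Taylor polynomials of matrices (UTPM).
    x_utpm = algopy.UTPM.init_jacobian(x0)
    grad_forward = algopy.UTPM.extract_jacobian(Phi(model_jacobian(x_utpm)))

    # Reverse mode: record the computation once on a CGraph, then pull back adjoints.
    cg = algopy.CGraph()
    x = algopy.Function(x0)
    y = Phi(model_jacobian(x))
    cg.trace_off()
    cg.independentFunctionList = [x]
    cg.dependentFunctionList = [y]
    grad_reverse = cg.gradient(x0)

    print(grad_forward)
    print(grad_reverse)

At the same evaluation point the forward- and reverse-mode gradients should agree to roughly machine precision; for a design vector $x$ with many components, the reverse sweep is the cheaper way to obtain $\nabla_x \Phi$.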
Lutz Lehmann
Sebastian F. Walter
No associations
Profile ID: LFWR-SCP-O-719125