Matlab Routines
Computes the derivatives of a function handle with respect to specified variables at the provided points. It uses the built-in automatic differentiation (AD) framework of Matlab's Deep Learning Toolbox.
Example: Plotting a bivariate function and its derivatives
x = 0:0.01:0.1; y = 0:0.01:0.2; [xx,yy] = meshgrid(x,y);
f = @(x,y) sin(2*pi*x.*y)+ y.^2; figure; surf(xx,yy,f(xx,yy));
dfdx = automaticdifferentiation(f,xx,1,yy,0); figure; surf(xx,yy,dfdx); % df/dx: order 1 in x, 0 in y
d2fdx2 = automaticdifferentiation(f,xx,2,yy,0); figure; surf(xx,yy,d2fdx2); % d2f/dx2: order 2 in x, 0 in y
d2fdxdy = automaticdifferentiation(f,xx,1,yy,1); figure; surf(xx,yy,d2fdxdy); % d2f/dxdy: order 1 in x, 1 in y
It supports multidimensional function handles and multivariate higher-order derivatives. Compatible with R2021a and later releases. Requires the Matlab Deep Learning Toolbox.
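For context, the sketch below shows how a first derivative can be evaluated directly with the Deep Learning Toolbox AD primitives (dlarray, dlfeval, dlgradient) at a single query point. It is only an illustration of the underlying mechanism, not the implementation of automaticdifferentiation, and the query point values are arbitrary.
f = @(x,y) sin(2*pi*x.*y) + y.^2;            % same test function as above
xq = dlarray(0.05); yq = dlarray(0.10);      % query point, traced for AD
dfdx = dlfeval(@(x,y) dlgradient(f(x,y),x), xq, yq); % df/dx at (xq,yq)
extractdata(dfdx)                            % compare with 2*pi*yq*cos(2*pi*xq*yq)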
Download and review it at the MathWorks File Exchange.
Product of multidimensional Matlab arrays following the Einstein summation convention, in which repeated indices are summed over. Supports multiple outer products, inner products, singleton dimensions, and pages.
Example: R[jzgi] = A[gxki] * B[kjg]
Outer indices: i,j
Inner indices: k
Page indices: g
Singleton indices: x,z
A = rand(5,1,4,8); % A_gxki
B = rand(4,9,5 ); % B_kjg
R = tensorproduct('jzgi',A,'gxki',B,'kjg'); % R_jzgi
size(R) % [9,1,5,8]
It is written entirely in Matlab, and the core computation reduces to a matrix-matrix product with no for loops.
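As an illustration of that kind of reduction, the sketch below reproduces the layout of the example result with a plain permute step plus pagemtimes (R2020b+) for the page index g. It is only a sketch of the idea, not the routine's actual implementation, which reportedly boils everything down to a single matrix-matrix product.
A  = rand(5,1,4,8);               % A_gxki, as in the example above
B  = rand(4,9,5);                 % B_kjg
Ap = permute(squeeze(A),[3 2 1]); % reorder A to (i,k,g), size 8x4x5
C  = pagemtimes(Ap,B);            % contract k page-wise over g -> (i,j,g)
Rref = permute(C,[2 4 3 1]);      % reorder to (j,z,g,i), z a singleton
size(Rref)                        % [9,1,5,8], same layout as R above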
Download and review it at the MathWorks File Exchange.