B.S.T.J. Briefs: Solving Nonlinear Network Equations Using Optimization Techniques

01 November 1969


A class of nonlinear equations arising in transistor network analysis, as well as in other areas, has the form

$$f_i(x_i) + \sum_{j=1}^{n} a_{ij} x_j - b_i = 0, \qquad i = 1, 2, \ldots, n, \tag{1}$$

or in matrix notation

$$F(x) + Ax - b = 0, \tag{2}$$

where the nonlinearities $f_i(\cdot)$ are continuously differentiable, strictly monotone increasing functions. Results by Willson [1] and Sandberg and Willson [2, 3] on nonlinear networks have included broad conditions for the existence and uniqueness of a solution to equation (2). However, convergent computational algorithms for finding the solution have been given only for restricted subclasses of the class of equations that have unique solutions [1, 2, 4, 5]. These subclasses are characterized by a variety of restrictions on the matrix A and on the type of nonlinearities. In this brief we show that a single convergent algorithm exists for solving these equations under conditions virtually as broad as the known existence and uniqueness conditions. Peripherally, we obtain under these conditions a conceptually simple proof of the existence of a solution.

The approach is to use the old technique (probably due to Cauchy) of converting a root-finding problem to a minimization problem. Let

$$r(x) \triangleq F(x) + Ax - b, \tag{3}$$

and define the scalar-valued "potential" function

$$Q(x) \triangleq r^T r. \tag{4}$$
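To make the minimization idea concrete, the following is a minimal sketch in Python/NumPy. The nonlinearities f_i(x) = x + tanh(x), the matrix A, the vector b, and the backtracking steepest-descent step rule are all illustrative assumptions, not taken from the brief; the brief establishes convergence of its algorithm in general, whereas this is only a generic descent on the potential Q of equation (4).

```python
import numpy as np

# Illustrative nonlinearities (an assumption, not from the brief):
# f_i(x) = x + tanh(x) is continuously differentiable and strictly
# monotone increasing, with derivative 1 + sech^2(x) > 0.
def F(x):
    return x + np.tanh(x)

def F_prime(x):
    return 1.0 + 1.0 / np.cosh(x) ** 2

# Illustrative problem data (assumed; A is positive definite here).
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([1.0, -2.0])

def r(x):
    return F(x) + A @ x - b          # residual, equation (3)

def Q(x):
    return r(x) @ r(x)               # potential, equation (4)

def grad_Q(x):
    # The Jacobian of r is diag(f_i'(x_i)) + A, so grad Q = 2 J^T r.
    J = np.diag(F_prime(x)) + A
    return 2.0 * J.T @ r(x)

# Steepest descent with Armijo backtracking (an assumed step rule).
x = np.zeros(2)
for _ in range(500):
    g = grad_Q(x)
    if np.linalg.norm(g) < 1e-10:
        break
    t = 1.0
    while Q(x - t * g) > Q(x) - 0.5 * t * (g @ g):
        t *= 0.5
    x = x - t * g

print("solution x =", x, "residual norm =", np.linalg.norm(r(x)))
```

Driving Q to zero drives r to zero, so the minimizer of Q is a root of equation (2); the substance of the brief lies in showing that, under conditions essentially as broad as the known existence and uniqueness conditions, such a descent cannot stall at a spurious stationary point.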