Multivariate Quadratic Regression
You can do multivariate quadratic regression in the usual way. Let's label the row (and column) indices of the normal-equations matrix $A$, and the row index of the value vector $b$, by an index $s(\{p_1, p_2, p_3, \cdots\})$ pertaining to the coefficient of $x_1^{p_1}x_2^{p_2}\cdots$. For example, the row labeled $s(\{ 1, 0, 2\})$ will be the row pertaining to the coefficient of $x_1x_3^2$.
Then the elements of $A$ are calculated as $$ A_{s(\{p_1, p_2, p_3, \cdots\}),s(\{q_1, q_2, q_3, \cdots\})} = \sum x_1^{p_1+q_1} x_2^{p_2+q_2} x_3^{p_3+q_3} \cdots $$ and the elements of $b$ are $$ b_{s(\{p_1, p_2, p_3, \cdots\})} = \sum y\,x_1^{p_1} x_2^{p_2} x_3^{p_3} \cdots $$ where of course all the sums are taken over the set of data points.
For example, for a 2-variable quadratic fit $y = a + bu + cv + du^2 + e uv + fv^2$ you need to solve $$ \pmatrix{N &\sum u_i &\sum v_i & \sum u_i^2 & \sum u_iv_i & \sum v_i^2 \\ \sum u_i & \sum u_i^2 & \sum u_i v_i & \sum u_i^3 & \sum u_i^2v_i & \sum u_i v_i^2 \\ \sum v_i & \sum u_iv_i & \sum v_i^2 & \sum u_i^2v_i & \sum u_iv_i^2 & \sum v_i^3 \\ \sum u_i^2 & \sum u_i^3 & \sum u_i^2 v_i & \sum u_i^4 & \sum u_i^3v_i & \sum u_i^2 v_i^2 \\ \sum u_iv_i & \sum u_i^2v_i & \sum u_i v_i^2 & \sum u_i^3v_i & \sum u_i^2v_i^2 & \sum u_i v_i^3 \\ \sum v_i^2 & \sum u_iv_i^2 & \sum v_i^3 & \sum u_i^2v_i^2 & \sum u_iv_i^3 & \sum v_i^4 } \pmatrix{a\\b\\c\\d\\e\\f} =\pmatrix{\sum y_i \\ \sum y_i u_i \\ \sum y_iv_i \\ \sum y_iu_i^2\\ \sum y_iu_iv_i \\ \sum y_iv_i^2} $$
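As a sanity check, here is a minimal NumPy sketch of the 2-variable case: it builds the $N\times 6$ matrix of monomials, forms the $6\times 6$ matrix of sums and the right-hand side above via $M^\top M$ and $M^\top y$, and solves for the coefficients. The sample data (noise-free, generated from known coefficients) is purely illustrative.

```python
import numpy as np

# Illustrative data generated from known coefficients
# y = 1 + 2u + 3v + 0.5u^2 - uv + 0.25v^2
rng = np.random.default_rng(0)
u = rng.uniform(-1, 1, 50)
v = rng.uniform(-1, 1, 50)
y = 1 + 2*u + 3*v + 0.5*u**2 - u*v + 0.25*v**2

# Columns are the monomials 1, u, v, u^2, uv, v^2
M = np.column_stack([np.ones_like(u), u, v, u**2, u*v, v**2])

# A = M^T M reproduces the 6x6 matrix of sums above,
# and rhs = M^T y reproduces the right-hand side.
A = M.T @ M
rhs = M.T @ y
coeffs = np.linalg.solve(A, rhs)  # [a, b, c, d, e, f]
print(coeffs)
```

In practice one would usually call `np.linalg.lstsq(M, y)` directly rather than forming $M^\top M$ explicitly, since the normal equations square the condition number; the explicit form is shown here only to mirror the matrix above.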