## Existence of a non-dimensionless product

Imagine the dimensional matrix of the variables *y*, *x*_{1}, *x*_{2}, …, *x*_{n}, in which the dimensional exponents *a*, *b*, and *c* of *y* are *not all* zero. Under what conditions does there exist a product of the form

*y* = *x*_{1}^{k1} ⋅ *x*_{2}^{k2} ⋅ … ⋅ *x*_{n}^{kn}?

### First condition

If *y* is not dimensionless, a product of the form

*y* = *x*_{1}^{k1} ⋅ *x*_{2}^{k2} ⋅ … ⋅ *x*_{n}^{kn}

exists if and only if the dimensional matrix of the independent variables *x*_{1}, *x*_{2}, …, *x*_{n} has the same rank as the dimensional matrix of the variables *y*, *x*_{1}, *x*_{2}, …, *x*_{n}.

Suppose the exponents *k*_{1}, *k*_{2}, …, *k*_{n} exist such that they are a solution of the linear system of equations

*a*_{1}*k*_{1} + *a*_{2}*k*_{2} + … + *a*_{n}*k*_{n} = *a*

*b*_{1}*k*_{1} + *b*_{2}*k*_{2} + … + *b*_{n}*k*_{n} = *b*

⋮

*g*_{1}*k*_{1} + *g*_{2}*k*_{2} + … + *g*_{n}*k*_{n} = *g*

where *a*, *b*, …, *g* are the dimensional exponents of *y*, and *a*_{i}, *b*_{i}, …, *g*_{i} those of *x*_{i}.

Let *r* be the rank of the dimensional matrix of coefficients of the *k*'s, and let *R* be the rank of that matrix augmented by the column *a*, *b*, …, *g*. The **theorem for consistency of a system of linear equations** tells us that for the above system of equations to be consistent the ranks must be equal,

*r* = *R*.
Because the system is consistent and the *k*_{i}'s are a solution, the product

*y* = *x*_{1}^{k1} ⋅ *x*_{2}^{k2} ⋅ … ⋅ *x*_{n}^{kn}

must therefore exist.∎
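The rank test and the system for the *k*'s can be checked numerically. The sketch below is an illustration under assumed data (taking *y* to be a velocity built from a length *x*_{1} and a time *x*_{2}; these particular variables are not from the text), using NumPy to confirm *r* = *R* and solve for the exponents:

```python
# Illustrative check of the first condition (assumed example, not from the
# text): y = velocity [M^0 L^1 T^-1], x1 = length [L], x2 = time [T].
import numpy as np

# Columns hold the dimensional exponents of x1 and x2; rows are M, L, T.
A = np.array([[0.0, 0.0],
              [1.0, 0.0],
              [0.0, 1.0]])
y = np.array([0.0, 1.0, -1.0])  # dimensional exponents a, b, c of y

r = np.linalg.matrix_rank(A)
R = np.linalg.matrix_rank(np.column_stack([A, y]))
print(r, R)  # equal ranks: the system for the k's is consistent

# Solve for exponents k1, k2 such that [x1^k1 * x2^k2] = [y].
k, *_ = np.linalg.lstsq(A, y, rcond=None)
print(k)  # [ 1. -1.]  ->  y ~ x1^1 * x2^-1, i.e. length/time
```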

Before moving on, let us define the terms *system of linear equations* and *solution of the system*.

Consider the equation

*x*_{1} − *x*_{2} + 3*x*_{3} = −4.

This is a linear equation in the variables *x*_{1}, *x*_{2}, and *x*_{3}. Take another equation, given by

3*x*_{1} + *x*_{2} + 9*x*_{3} = −4.

This, too, is a linear equation in the variables *x*_{1}, *x*_{2}, and *x*_{3}. Since both equations are linear in the same variables, we can seek values of these variables that satisfy both equations simultaneously. The equations can therefore be lumped together as

*x*_{1} − *x*_{2} + 3*x*_{3} = −4

3*x*_{1} + *x*_{2} + 9*x*_{3} = −4.

For example, *x*_{1} = 1, *x*_{2} = 2, and *x*_{3} = −1 satisfy both equations.
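A quick numerical check of such a candidate solution (a sketch; the system with both right-hand sides equal to −4 is assumed):

```python
# Check that (x1, x2, x3) = (1, 2, -1) satisfies the system
#   x1 - x2 + 3*x3 = -4
# 3*x1 + x2 + 9*x3 = -4
import numpy as np

A = np.array([[1.0, -1.0, 3.0],
              [3.0,  1.0, 9.0]])
b = np.array([-4.0, -4.0])
x = np.array([1.0, 2.0, -1.0])

print(A @ x)                        # [-4. -4.]
print(bool(np.allclose(A @ x, b)))  # True
```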

Generalizing this, we define a **system of linear equations** as a finite set of linear equations in the variables *x*_{1}, *x*_{2}, …, *x*_{n}. A **solution** is defined as a sequence of numbers *s*_{1}, *s*_{2}, …, *s*_{n} such that *x*_{1} = *s*_{1}, *x*_{2} = *s*_{2}, …, *x*_{n} = *s*_{n} is a solution of every equation in the system. There may be another sequence of numbers *t*_{1}, *t*_{2}, …, *t*_{n} which also solves every equation in the system. Therefore, the **solution set** is defined as the set of all solutions of the system.

#### Not all systems of linear equations have solutions

Consider the system given by

*x*_{1} + *x*_{2} = 4

2*x*_{1} + 2*x*_{2} = 6.

Dividing the second equation by 2 gives the equivalent pair

*x*_{1} + *x*_{2} = 4

*x*_{1} + *x*_{2} = 3.

These two equations contradict each other, so no values of the variables can satisfy both. We then define a system to be **inconsistent** if it has *no* solution, and **consistent** if it has *at least one* solution.
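Numerically, the inconsistency shows up as unequal ranks of the coefficient matrix and the augmented matrix. A minimal sketch:

```python
# The inconsistent system above: x1 + x2 = 4 and 2*x1 + 2*x2 = 6.
import numpy as np

A = np.array([[1.0, 1.0],
              [2.0, 2.0]])
b = np.array([4.0, 6.0])

rank_A  = np.linalg.matrix_rank(A)
rank_Ab = np.linalg.matrix_rank(np.column_stack([A, b]))
print(rank_A, rank_Ab)  # 1 2 -> unequal ranks, so no solution exists
```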

The common dimension of the row space and column space of a matrix *A* is called the **rank** of *A*.^{(ibid. 5.)}

**Lemma:** A system of linear equations *A***x** = **b** is consistent if and only if **b** is in the column space of *A*.

Writing out *A***x** = **b** column by column shows that **b** is a linear combination of the column vectors of matrix *A*, with the entries of **x** as the coefficients. From the definition^{(ibid. 5.)} we know that for such a linear combination to exist, at least one set of scalars *x*_{1}, *x*_{2}, …, *x*_{n} must exist. It follows that a vector **x** of such scalars exists exactly when **b** is a linear combination of the column vectors of *A*.

Therefore, the system is consistent if and only if **b** is a linear combination of the column vectors, and consequently **b** is in the column space.∎
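The lemma can be probed numerically: **b** lies in the column space of *A* exactly when a least-squares solution reproduces **b**. A sketch with a hypothetical 3 × 2 matrix (not from the text):

```python
# Test whether b lies in the column space of A: b is in the column space
# exactly when the least-squares solution x makes A @ x reproduce b.
import numpy as np

A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])

b_in  = np.array([1.0, 2.0, 3.0])  # equals 1*col1 + 2*col2: in the column space
b_out = np.array([1.0, 2.0, 0.0])  # no combination of the columns gives this

def in_column_space(A, b):
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return bool(np.allclose(A @ x, b))

print(in_column_space(A, b_in))   # True  -> A x = b_in is consistent
print(in_column_space(A, b_out))  # False -> A x = b_out is inconsistent
```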

**Theorem:** A system of linear equations *A***x** = **b** is consistent if and only if the rank of its augmented matrix is equal to the rank of *A*.

Consider a system *A***x** = **b** in which *A* is an *m* × *n* matrix, **x** an *n* × 1 column vector, and **b** an *m* × 1 column vector, so that [*A* **b**] is the augmented matrix.

From the lemma^{(ibid. 5)} that *nonzero row vectors in a row-echelon form of a matrix A form a basis for the row space of A*, the number of nonzero rows in a row-echelon form of *A* gives the dimension of the row space of *A*, and consequently the rank of matrix *A*.

From the lemma that *A***x** = **b** is consistent if and only if **b** is in the column space of *A*, the column **b**, which is the only difference between the matrices *A* and [*A* **b**], must lie in the column space of *A* whenever the system is consistent. Appending a column that is a linear combination of the existing columns does not change the rank, so the number of nonzero rows following row operations on *A* and on [*A* **b**] will be equal.

Therefore, for a consistent system *A***x** = **b**, matrices *A* and [*A* **b**] will have equal ranks.∎
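As a numerical illustration of the theorem, take a small consistent system and compare the two ranks (a sketch; the particular system is illustrative):

```python
# Rank test on a consistent system:
#   x1 - x2 + 3*x3 = -4
# 3*x1 + x2 + 9*x3 = -4
import numpy as np

A = np.array([[1.0, -1.0, 3.0],
              [3.0,  1.0, 9.0]])
b = np.array([-4.0, -4.0])

rank_A  = np.linalg.matrix_rank(A)
rank_Ab = np.linalg.matrix_rank(np.column_stack([A, b]))
print(rank_A, rank_Ab)  # 2 2 -> equal ranks, so the system is consistent
```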

### Second condition

If *y* = *ƒ*(*x*_{1}, *x*_{2}, …, *x*_{n}) is a dimensionally homogeneous equation, and if *y* is not dimensionless, there exists a product of powers of the *x*'s that has the same dimension as *y*.

We prove this by contradiction.

**Hypothesis:** *y* = *ƒ*(*x*_{1}, *x*_{2}, …, *x*_{n}) is dimensionally homogeneous, and a product of the form

*y* = *x*_{1}^{k1} ⋅ *x*_{2}^{k2} ⋅ … ⋅ *x*_{n}^{kn}

does not exist.

Consider the dimensional matrix of the variables *y*, *x*_{1}, *x*_{2}, …, *x*_{n} involving three fundamental units of dimensions [*M*], [*L*], and [*T*]. In this matrix the dimensional exponents *a*, *b*, and *c* of *y* are not all zero, and the rank of the dimensional matrix is *R*.

The hypothesis tells us that a product of the form *y* = *x*_{1}^{k1} ⋅ *x*_{2}^{k2} ⋅ … ⋅ *x*_{n}^{kn} does not exist. If *r* is the rank of the dimensional matrix of the independent variables *x*_{1}, *x*_{2}, …, *x*_{n}, and we already know that the rank of the dimensional matrix of the dependent and independent variables together is *R*, then, according to the **theorem for the first condition**, *r* ≠ *R*. Moreover, the matrix of the independent variables is a submatrix of the full dimensional matrix, so *r* cannot exceed *R*. Thus,

*r* < *R*.

Suppose *R* = 3. Then the dimensional matrix contains a nonsingular 3 × 3 submatrix, and since *r* < 3 that submatrix must include the column of *y*; say it consists of the column (*a*, *b*, *c*) of *y* together with the columns of *x*_{1} and *x*_{2}. Its determinant is therefore nonzero, and expanding in cofactors along the column of *y*,

Δ = *aC*_{11} + *bC*_{21} + *cC*_{31} ≠ 0.

If the column of *y* is instead replaced by the column of *x*_{i} with *i* = 1 or *i* = 2, the determinant is automatically equal to zero, since the matrix then has two identical columns. For instance, for *i* = 1,

Δ = *a*_{1}(*b*_{1}*c*_{2} − *b*_{2}*c*_{1}) + *b*_{1}(*a*_{2}*c*_{1} − *a*_{1}*c*_{2}) + *c*_{1}(*a*_{1}*b*_{2} − *a*_{2}*b*_{1})

= *a*_{1}*b*_{1}*c*_{2} − *a*_{1}*b*_{2}*c*_{1} + *a*_{2}*b*_{1}*c*_{1} − *a*_{1}*b*_{1}*c*_{2} + *a*_{1}*b*_{2}*c*_{1} − *a*_{2}*b*_{1}*c*_{1}

= 0.

Similarly, for *i* = 2,

Δ = *a*_{2}(*b*_{1}*c*_{2} − *b*_{2}*c*_{1}) + *b*_{2}(*a*_{2}*c*_{1} − *a*_{1}*c*_{2}) + *c*_{2}(*a*_{1}*b*_{2} − *a*_{2}*b*_{1})

= *a*_{2}*b*_{1}*c*_{2} − *a*_{2}*b*_{2}*c*_{1} + *a*_{2}*b*_{2}*c*_{1} − *a*_{1}*b*_{2}*c*_{2} + *a*_{1}*b*_{2}*c*_{2} − *a*_{2}*b*_{1}*c*_{2}

= 0.

For *i* ≥ 3 the determinant is not automatically equal to zero. For *i* = 3,

Δ = *a*_{3}(*b*_{1}*c*_{2} − *b*_{2}*c*_{1}) + *b*_{3}(*a*_{2}*c*_{1} − *a*_{1}*c*_{2}) + *c*_{3}(*a*_{1}*b*_{2} − *a*_{2}*b*_{1})

= *a*_{3}*b*_{1}*c*_{2} − *a*_{3}*b*_{2}*c*_{1} + *a*_{2}*b*_{3}*c*_{1} − *a*_{1}*b*_{3}*c*_{2} + *a*_{1}*b*_{2}*c*_{3} − *a*_{2}*b*_{1}*c*_{3}.

This is, up to a rearrangement of columns, the determinant of the 3 × 3 submatrix formed from the columns of *x*_{1}, *x*_{2}, and *x*_{3} alone. But since *r* < *R* = 3, every 3 × 3 submatrix of the dimensional matrix of the independent variables is singular. Therefore, for *i* ≥ 3 the determinants are all equal to zero.

Let the cofactors be renamed *C*_{11} → *α*, *C*_{21} → *β*, *C*_{31} → *γ*. The determinant computations above then say

*aα* + *bβ* + *cγ* ≠ 0,

*a*_{i}*α* + *b*_{i}*β* + *c*_{i}*γ* = 0, for all *i* = 1, 2, …, *n*.
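These two cofactor relations can be verified numerically. The sketch below builds a hypothetical dimensional matrix (not from the text) with *r* = 2 < *R* = 3 and checks both relations:

```python
# Hypothetical dimensional matrix with r = 2 < R = 3:
# rows are the fundamental dimensions M, L, T; y's column is (a, b, c).
import numpy as np

y  = np.array([1.0, 1.0, -2.0])        # exponents a, b, c of y
xs = np.array([[0.0, 0.0, 0.0],        # M row: a_1, a_2, a_3
               [1.0, 0.0, 2.0],        # L row: b_1, b_2, b_3
               [0.0, 1.0, -1.0]])      # T row: c_1, c_2, c_3

assert np.linalg.matrix_rank(xs) == 2                        # r = 2
assert np.linalg.matrix_rank(np.column_stack([y, xs])) == 3  # R = 3

# Nonsingular 3x3 submatrix: columns of y, x1, x2.
M = np.column_stack([y, xs[:, 0], xs[:, 1]])

def cofactor(mat, i, j):
    minor = np.delete(np.delete(mat, i, axis=0), j, axis=1)
    return (-1) ** (i + j) * np.linalg.det(minor)

# alpha, beta, gamma are the cofactors down y's column.
alpha, beta, gamma = (cofactor(M, i, 0) for i in range(3))
v = np.array([alpha, beta, gamma])

print(y @ v)     # = det(M), nonzero: a*alpha + b*beta + c*gamma != 0
print(xs.T @ v)  # all zeros: a_i*alpha + b_i*beta + c_i*gamma = 0
```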

Since our hypothesis claims that *y* = *ƒ*(*x*_{1}, *x*_{2}, …, *x*_{n}) is dimensionally homogeneous,

*K* ⋅ *ƒ*(*x*_{1}, *x*_{2}, …, *x*_{n}) = *ƒ*(*K*_{1}*x*_{1}, *K*_{2}*x*_{2}, …, *K*_{n}*x*_{n})

where

*K* = *A*^{a} ⋅ *B*^{b} ⋅ *C*^{c} and *K*_{i} = *A*^{ai} ⋅ *B*^{bi} ⋅ *C*^{ci} for all *i* = 1, 2, …, *n*,

and *A*, *B*, and *C* are arbitrary positive scale factors for the fundamental units [*M*], [*L*], and [*T*].
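This scaling identity can be sanity-checked on a concrete dimensionally homogeneous relation. The sketch below assumes the illustrative relation *v* = ℓ/*t* (a velocity from a length and a time), which is not an example from the text:

```python
# Check of the scaling identity K*f(x1, ..., xn) = f(K1*x1, ..., Kn*xn)
# for the dimensionally homogeneous relation v = f(l, t) = l / t,
# where units M, L, T are rescaled by factors A, B, C.
def f(l, t):
    return l / t

A, B, C = 2.0, 3.0, 5.0   # arbitrary positive unit scale factors
l, t = 10.0, 4.0          # arbitrary sample values

K  = B / C                # K  = A^0 * B^1 * C^-1, since [v] = L T^-1
K1 = B                    # K1 = B^1, since [l] = L
K2 = C                    # K2 = C^1, since [t] = T

print(abs(K * f(l, t) - f(K1 * l, K2 * t)) < 1e-12)  # True
```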

If *G* is some arbitrary positive constant, choose *A* = *G*^{α}, *B* = *G*^{β}, and *C* = *G*^{γ}. Substituting into the *K*'s we get

*K* = *A*^{a} ⋅ *B*^{b} ⋅ *C*^{c} = *G*^{αa} ⋅ *G*^{βb} ⋅ *G*^{γc} = *G*^{(aα + bβ + cγ)}

and

*K*_{i} = *G*^{(aiα + biβ + ciγ)} = *G*^{0} = 1, for all *i* = 1, 2, …, *n*.

So every *K*_{i} = 1, while *K* = *G*^{(aα + bβ + cγ)} is an arbitrary positive constant, since the exponent is nonzero and *G* is arbitrary. Substituting into

*K* ⋅ *ƒ*(*x*_{1}, *x*_{2}, …, *x*_{n}) = *ƒ*(*K*_{1}*x*_{1}, *K*_{2}*x*_{2}, …, *K*_{n}*x*_{n})

gives

*K y* = *ƒ*(*x*_{1}, *x*_{2}, …, *x*_{n}).

That is, an *arbitrary* positive constant multiplied by *y* is returned by the operator *ƒ* working on the independent variables. In other words, for this equation there is *no single-valued correspondence* from the independent variables *x*_{1}, *x*_{2}, …, *x*_{n} to the dependent variable *y*. That is, *ƒ*(*x*_{1}, *x*_{2}, …, *x*_{n}) *cannot be a function*. This contradicts the hypothesis that *y* = *ƒ*(*x*_{1}, *x*_{2}, …, *x*_{n}) is dimensionally homogeneous and hence, by definition, that *ƒ* is a function.

We can therefore state that, when the rank of the dimensional matrix is three, if *y* = *ƒ*(*x*_{1}, *x*_{2}, …, *x*_{n}) is dimensionally homogeneous and *y* is not dimensionless, then there exists a product of powers of the *x*'s that has the same dimension as *y*.

Following the same steps, one can prove the statement for other ranks.∎

*Next:*

Buckingham's Theorem ➽