If the coefficients $a_i,b_i,c_i$ are not constrained to be non-negative, $a_2$ can be made arbitrarily large.
Consider $b_1 = b_2 = b_3 = \frac{500}{3}$. Let $a_2 = M$ be any positive number, and set $a_1 = \frac{5}{3}M$. Also, let $c_3 = -1$. Then $c_1 = c_2 = \frac{18+5M}{8M}$ satisfies all the equations. If $M$ is sufficiently large ($M > \frac{3}{8}$), then $a_3 < 0$, so the product of all the terms is positive.
So $a_2$ can be made as large as desired; in other words, it is unbounded.
If you assume that all the $a_i,b_i,c_i$ are non-negative, the problem is straightforward.
Define new variables $x_1 = a_1b_1c_1$, $x_2 = a_2b_2c_2$, $x_3 = a_3b_3c_3$ in the first relation. Then solve the simple LP that arises.
Try any online solver, MATLAB, or Mathematica for a solution. You need not know how to solve an LP to get a solution.
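As an illustration, here is a minimal sketch using SciPy's `linprog`. The objective and constraint below are placeholders (the actual equations from the problem would go in their place): it maximizes $x_2$ subject to $x_1+x_2+x_3 = 1$ with non-negative variables.

```python
from scipy.optimize import linprog

# Placeholder LP: maximize x2 subject to x1 + x2 + x3 = 1, x >= 0.
# linprog minimizes, so negate the objective to maximize.
c = [0, -1, 0]
A_eq = [[1, 1, 1]]
b_eq = [1]

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * 3)
print(res.x)  # optimal point: all weight on x2
```

Swapping in the real constraint data is all that is needed; the solver handles the rest.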
I will explain this with a much simpler example in the two-dimensional case. Say we have the following equations:
\begin{equation}
\begin{aligned}
2x + 3y & =5&\text{ (1) } \\
x + 3y &= 4 & \text{ (2) }
\end{aligned}
\end{equation}
This system can be represented as follows:
$$\begin{pmatrix} 2 & 3 \\ 1 & 3 \end{pmatrix} \begin{pmatrix}x \\ y \end{pmatrix} = \begin{pmatrix} 5 \\ 4 \end{pmatrix} $$
When doing row reduction, I am allowed to do the following operations:
(1) Interchanging two rows.
(2) Multiplying a row by a non-zero scalar.
(3) Adding a multiple of one row to another row.
All these operations on the matrix translate to the operations we are familiar with when solving a system of linear equations. For example, subtracting equation $(2)$ from equation $(1)$ results in the equation $x = 1$. On the matrix, this means subtracting row $2$ from row $1$ on both sides (or on the augmented matrix), which gives
$$\begin{pmatrix} 1 & 0 \\ 1 & 3 \end{pmatrix}\begin{pmatrix}x \\ y \end{pmatrix} = \begin{pmatrix} 1 \\ 4 \end{pmatrix}$$ To simplify further, we can subtract row $1$ from row $2$, which gives
$$\begin{pmatrix} 1 & 0 \\ 0 & 3 \end{pmatrix}\begin{pmatrix}x \\ y \end{pmatrix} = \begin{pmatrix} 1 \\ 3 \end{pmatrix}$$
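The same elimination can be checked numerically. A small sketch with NumPy, mirroring the two row operations above step by step:

```python
import numpy as np

# The system 2x + 3y = 5, x + 3y = 4 as a matrix and right-hand side.
A = np.array([[2.0, 3.0],
              [1.0, 3.0]])
b = np.array([5.0, 4.0])

# R1 <- R1 - R2: the first equation becomes x = 1.
A[0] -= A[1]; b[0] -= b[1]
# R2 <- R2 - R1: the second equation becomes 3y = 3.
A[1] -= A[0]; b[1] -= b[0]

print(A)                 # [[1. 0.] [0. 3.]]
print(b)                 # [1. 3.]
print(b / A.diagonal())  # solution [x, y] = [1. 1.]
```

Of course, in practice one would simply call `np.linalg.solve(A, b)` on the original system, which performs an equivalent elimination internally.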
Why are we doing this? Matrices became more than just a tool for solving linear equations; they became algebraic objects in their own right, with many properties of their own. Read A. Cayley, "A Memoir on the Theory of Matrices." Sorry, I digress.
You can also do column operations, but then the system has to be written differently, with a row vector of unknowns multiplying the transposed matrix:
$$\begin{pmatrix} x & y \end{pmatrix}\begin{pmatrix} 2 &1 \\ 3 & 3 \end{pmatrix} = \begin{pmatrix} 5 & 4 \end{pmatrix}$$ Why don't we represent it this way? You tell me. I didn't answer your question directly, but I think with the right motivation you will find your way.
Now, coming back to linear independence, say we have the vectors $u_1 = \begin{pmatrix} 2 \\ 0 \end{pmatrix}$ and $u_2 = \begin{pmatrix} 1 \\ 2 \end{pmatrix}$. As you mentioned, the vectors are linearly independent if the system of equations has only the trivial solution, that is, $xu_1 + yu_2 = 0$ only if $x=y=0$, which means $$\begin{pmatrix} 2x \\ 0 \end{pmatrix} + \begin{pmatrix} y \\ 2y \end{pmatrix} = \begin{pmatrix} 2x +y \\ 2y \end{pmatrix} = \begin{pmatrix} 2x +y \\ 0x + 2y \end{pmatrix} = \begin{pmatrix} 2 & 1 \\ 0 & 2 \end{pmatrix}\begin{pmatrix} x \\ y \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix} $$ only if $$ \begin{pmatrix} x \\ y \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix} $$ Now the problem of determining whether a set of vectors is linearly independent has been reduced to the problem of finding the solutions of a system of linear equations. Note that $$ \begin{pmatrix} 2 & 1 \\ 0 & 2 \end{pmatrix} = \begin{pmatrix} u_1 & u_2 \end{pmatrix}$$ It is just a representation that happens to be convenient.
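In code, that reduction amounts to a single rank computation. A minimal sketch with NumPy, using the $u_1, u_2$ above as the columns of a matrix:

```python
import numpy as np

# Columns of U are u1 = (2, 0) and u2 = (1, 2).
U = np.array([[2, 1],
              [0, 2]])

# The columns are linearly independent iff Ux = 0 has only x = 0,
# i.e. iff the rank equals the number of columns.
independent = np.linalg.matrix_rank(U) == U.shape[1]
print(independent)  # True: u1 and u2 are linearly independent
```

The same check works for any number of vectors in any dimension: stack them as columns and compare the rank to the column count.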
If you're willing to posit that the determinant is linear, here's something for the $2 \times 2$ case: consider $\left(\begin{smallmatrix} a & b \\ 1 & 0 \end{smallmatrix}\right)$. Rowwise, that gives us a parallelogram with vertices $(a,b)$, $(1,0)$, and $(a+1,b)$. (And the origin, of course, but I won't bother writing that.) That's pretty clearly a parallelogram with base 1 and height $b$.
After transposing, $\left(\begin{smallmatrix} a & 1 \\ b & 0 \end{smallmatrix}\right)$ gives a parallelogram with vertices $(a,1)$, $(b,0)$, and $(a+b,1)$; now the height is 1 and the base is $b$.
Geometrically, you can slide the top of one of those until you get a rectangle, reflect about $y=x$, and slide the top again (all area-preserving transformations) to turn one into the other.
Scaling the $(1,0)$ row/column obviously just scales the resulting parallelograms. The $\left(\begin{smallmatrix} a & b \\ 0 & 1 \end{smallmatrix}\right)$ case works out the same. So if the linearity of the determinant is okay, you've got the $2 \times 2$ case.
For $3 \times 3$, my guess is that it suffices to do $\left(\begin{smallmatrix} a & b & c \\ 1 & 0 & 0 \\ 0 & 1 & 0 \end{smallmatrix}\right)$.
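As a quick numerical sanity check of the $\det A = \det A^{T}$ claim for both matrices above (the values chosen for $a$, $b$, $c$ are arbitrary):

```python
import numpy as np

a, b, c = 3.0, 5.0, 7.0

# The 2x2 and 3x3 matrices from the argument above.
A2 = np.array([[a, b],
               [1, 0]])
A3 = np.array([[a, b, c],
               [1, 0, 0],
               [0, 1, 0]])

for A in (A2, A3):
    # The determinant is unchanged by transposing,
    # matching the geometric area/volume argument.
    print(np.isclose(np.linalg.det(A), np.linalg.det(A.T)))  # True
```

Note that $\det A_2 = -b$ (signed area $\pm b$, as the parallelogram picture suggests) and $\det A_3 = c$.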
But this is still a bit unsatisfying. I'm guessing the OP wants a really concrete operation that turns one parallelepiped into the other so that the equality of volumes is forcefully obvious.