If $M$ is a square matrix with zero determinant, then its rows are linearly dependent, and its columns are linearly dependent as well.
A few small examples show how this fact works.
Consider $\begin{bmatrix} 1 & 2 \\ 2 & 4 \end{bmatrix}$.
Here row 2 is twice row 1, and the columns are dependent as well: column 2 is twice column 1.
Now take an example where the columns are obviously dependent (column 2 is three times column 1):
$\begin{bmatrix} 3 &9 \\ 7 & 21 \end{bmatrix}$
But row 2 is also a linear combination of row 1: the operation $7R_1 - 3R_2 \rightarrow R_2$ turns $R_2$ into the zero row.
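Both examples can be checked numerically. A minimal pure-Python sketch (the helper `det2` is defined here for illustration, not a library function):

```python
def det2(m):
    """Determinant of a 2x2 matrix given as [[a, b], [c, d]]."""
    (a, b), (c, d) = m
    return a * d - b * c

A = [[1, 2], [2, 4]]   # row 2 = 2 * row 1
B = [[3, 9], [7, 21]]  # column 2 = 3 * column 1

print(det2(A))  # 0
print(det2(B))  # 0

# Row dependence in B: 7*R1 - 3*R2 is the zero row.
print([7 * r1 - 3 * r2 for r1, r2 in zip(B[0], B[1])])  # [0, 0]
```

Dependent rows and dependent columns each force the determinant to vanish, and the two conditions always occur together.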
A matrix with zero determinant cannot have an inverse.
$MX = 0$ having a non-trivial solution means there is some $X \neq 0$ (the zero vector) with $MX = 0$. This is always possible when the determinant is zero, because the columns are linearly dependent: the coefficients of the dependence relation among the columns give such an $X$.
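For the first example above, $2 \cdot \text{col}_1 - 1 \cdot \text{col}_2 = 0$, so $X = (2, -1)^T$ is a non-trivial solution. A quick sketch verifying this (the `matvec` helper is defined here for the example):

```python
def matvec(M, x):
    """Multiply matrix M (given as a list of rows) by vector x."""
    return [sum(m_ij * x_j for m_ij, x_j in zip(row, x)) for row in M]

M = [[1, 2], [2, 4]]
# Column dependence: 2*col1 - 1*col2 = 0, so X = (2, -1) solves MX = 0.
X = [2, -1]
print(matvec(M, X))  # [0, 0]
```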
Why does a matrix with linearly dependent rows and columns have determinant 0?
If you decompose a singular matrix $A$ into $LDU$ form, where $L$ is lower triangular, $D$ is diagonal, and $U$ is upper triangular, then $D$, which holds the $n$ pivots, looks like this:
$\begin{bmatrix} p_1 & & & & & \\ & p_2 & & & & \\ & & \ddots & & & \\ & & & 0 & & \\ & & & & \ddots & \\ & & & & & p_n \end{bmatrix}$
One of the pivots is zero, because row elimination produces a zero row while generating $U$.
The determinant of $D$ is the product of the pivots: $\det D = p_1 \, p_2 \cdots 0 \cdots p_n = 0$.
Since $L$ and $U$ have $1$s on their diagonals, $\det A = \det L \cdot \det D \cdot \det U = 1 \cdot 0 \cdot 1 = 0$, and hence the determinant of $A$ is $0$.
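The elimination step can be sketched in a few lines of Python. This is a minimal illustration assuming no row exchanges are needed (a zero pivot simply stays zero), not a general-purpose LU routine:

```python
def pivots_by_elimination(A):
    """Gaussian elimination without row exchanges; returns the diagonal
    entries (pivots) left after eliminating below each pivot."""
    n = len(A)
    A = [row[:] for row in A]  # work on a copy
    for k in range(n):
        if A[k][k] == 0:
            continue  # zero pivot: nothing to eliminate with
        for i in range(k + 1, n):
            factor = A[i][k] / A[k][k]
            A[i] = [a_ij - factor * a_kj for a_ij, a_kj in zip(A[i], A[k])]
    return [A[i][i] for i in range(n)]

A = [[1, 2], [2, 4]]  # singular: row 2 = 2 * row 1
pivots = pivots_by_elimination(A)
print(pivots)  # the second pivot is zero

det = 1
for p in pivots:
    det *= p
print(det)  # product of pivots = determinant = 0
```

Elimination on this singular matrix wipes out the second row entirely, so the second pivot is zero and the product of pivots, which equals the determinant, is zero.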