What exactly do linear dependence and linear independence imply?
Intuitively, vectors being linearly independent means they represent independent directions in your vector space, while linearly dependent vectors means they don't. So, for example, suppose you have a set of vectors $\{x_1, \dots, x_5\}$ and you walk some distance in the $x_1$ direction, then a different distance in the $x_2$ direction, then again some distance in the $x_3$ direction. If in the end you are back where you started, then the vectors are linearly dependent (notice that I did not use all the vectors).
This is the intuition behind the notion, and it can be made into a definition: in the above example, if we start at $0$ and walk a distance $a_i$ in the $x_i$ direction, then ending up back where we started says exactly that $a_1x_1+a_2x_2+a_3x_3=0$ with not all of the $a_i$ equal to $0$. (This is how you should think of linear combinations: as directions to travel, given by your vectors.)
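As a concrete instance (the specific vectors here are my own illustration, not part of the original answer): in $\mathbb{R}^2$ take $x_1=(1,0)$, $x_2=(0,1)$, $x_3=(-1,-1)$. Walking one step in each of these directions returns you to the origin:

$$1\cdot(1,0)+1\cdot(0,1)+1\cdot(-1,-1)=(0,0),$$

so $\{x_1,x_2,x_3\}$ is linearly dependent, with coefficients $a_1=a_2=a_3=1$, none of which is zero.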
Finally, I will say that you should memorize the definitions. I've taught linear algebra to students for whom it was their first proof-based math class, and many don't realize how important knowing the PRECISE definition is. Definitions are crucial, and changing a single word can completely change the meaning. So my advice when just starting out is to make flash cards of ALL the definitions in your book and memorize them. Then, once you know them exactly, look at the examples after each definition in the book and see how they fit the definition.
The vectors are dependent ('they depend on one another') if there is some relation among them besides the trivial one with all coefficients $0$, which holds for any collection of vectors. So, dependent means there is some relation other than the all-zero one.
Put differently: independent means that if you want a linear combination of the vectors to sum to the $0$ vector, you must ensure that each part of the combination is independently $0$; that is, every coefficient in the combination must be $0$.
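As a small check (these particular vectors are my own choice, for illustration): take $x_1=(1,0)$ and $x_2=(1,1)$ in $\mathbb{R}^2$. Setting a combination equal to the zero vector,

$$a_1(1,0)+a_2(1,1)=(a_1+a_2,\;a_2)=(0,0),$$

forces $a_2=0$ and then $a_1=0$, so the only combination summing to $0$ is the all-zero one, and the set is linearly independent.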
A broader perspective on linear dependence comes from the theory of relations in group theory. Roughly speaking, a relation is some equation satisfied by the elements of a group, e.g. $(ab)^{-1}=b^{-1}a^{-1}$; relations basically amount to declaring how group elements depend on each other. One useful convenience is that relations can always be put into the form "$\rm blah=identity~element$" by simply moving one side over to the other via inversion, e.g. $ab=c\Leftrightarrow abc^{-1}=e$.
Abelian groups (more generally, modules) have additive group operations, so a relation looks like an equation $2a+b=3c$, or equivalently $2a+b-3c=0$. In particular, a vector space is a module over a field, so instead of just integer coefficients we can have arbitrary field scalars in our equations among vectors. Ultimately, a linear dependency is a relation of this kind that the vectors satisfy with one another.
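To see such a relation with explicit vectors (this particular instance is my own illustration): in $\mathbb{R}^2$, let $a=(1,0)$, $b=(1,2)$, and $c=(1,\tfrac{2}{3})$. Then

$$2a+b=(3,2)=3c,\qquad\text{i.e.}\qquad 2a+b-3c=0,$$

a nontrivial relation, so $\{a,b,c\}$ is linearly dependent.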
Conversely, a set of vectors is linearly independent if it satisfies no linear relation other than the obvious, trivial one involving only zero coefficients (this case is uninteresting because it holds universally and so says essentially nothing). So, e.g., $2a+b=3c$ is impossible if $\{a,b,c\}$ is L.I., since it would rearrange to the nontrivial relation $2a+b-3c=0$.