Infinite dimensional version of a simple fact on certain singular matrices
For the first question, the answer is not necessarily.
Very rough idea: The rank-nullity theorem doesn't always hold on infinite dimensional spaces.
Rough idea: Let the operator $A$ on $L^2(M)$ be an injective mapping whose range does not contain the constant functions. More precisely, since $M$ is compact we can enumerate the eigenvalues of the Laplacian in increasing order, with multiplicity, as $\lambda_i$, with $\lambda_0 = 0$ corresponding to the constants. Now let $\psi:\overline{\mathbb{N}}\to \overline{\mathbb{N}}$ be injective with $0$ not in its range. Then defining $A$ as the map that sends the $i$th eigenspace to the $\psi(i)$th eigenspace will provide a counterexample.
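A minimal finite-dimensional sketch of this idea (all names here are illustrative): treat each eigenspace as one-dimensional, take $\psi(i) = i+1$, and check on a truncation that the resulting map is injective while its range misses the "constants" slot $0$.

```python
import numpy as np

# Finite truncation of the operator A: basis vector i (the i-th eigenspace)
# is sent to basis vector psi(i) = i + 1, so slot 0 (the constants) is never hit.
n = 8
psi = lambda i: i + 1

A = np.zeros((n + 1, n))  # target one slot larger so psi stays in range
for i in range(n):
    A[psi(i), i] = 1.0

assert np.linalg.matrix_rank(A) == n  # injective on the truncation
assert np.allclose(A[0, :], 0.0)      # row 0 empty: constants not in the range
```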
Realization: In practice, to guarantee smoothness it is easier not to insist that $A$ be an isometry. Take $M = \mathbb{S}^1$ for simplicity. Let
$$ \phi_-(x) := \sum_{k < 0} 2^{-|k|} e^{ik x} $$
The series is absolutely convergent and in fact defines a $C^\infty$ function. Similarly we define
$$ \phi_+(x) := \sum_{k \geq 0} 2^{-|k|} e^{ikx} $$
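Both series are geometric, so they have explicit closed forms, which also makes the smoothness claim transparent:
$$ \phi_+(x) = \sum_{k \geq 0} \left(\frac{e^{ix}}{2}\right)^k = \frac{2}{2 - e^{ix}}, \qquad \phi_-(x) = \sum_{j \geq 1} \left(\frac{e^{-ix}}{2}\right)^j = \frac{e^{-ix}}{2 - e^{-ix}}, $$
and the denominators never vanish since $|e^{\pm ix}| = 1 \neq 2$.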
Define your function $g$ by
$$ g(x,y) = \phi_-(x-y) + e^{ix} \phi_+(x-y) $$
It is easy to check that $\int_{0}^{2\pi} g(x,y) ~\mathrm{d}x = 0$ for any fixed $y$. But the operator $f(x) \mapsto \int g(x,y) f(y) ~\mathrm{d}y$ has no nontrivial kernel: it sends the Fourier mode $e^{imy}$ to a nonzero multiple of $e^{imx}$ for $m < 0$ and of $e^{i(m+1)x}$ for $m \geq 0$, so distinct modes land on distinct modes and nothing is annihilated.
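A numerical sanity check of this example (a sketch; the truncation level $K$ and grid size $N$ are arbitrary choices, harmless since $2^{-|k|}$ decays fast):

```python
import numpy as np

# Build g(x,y) on S^1 from a truncated Fourier series, then verify:
# (a) the x-average of g(x,y) vanishes for each fixed y;
# (b) the operator f -> \int g(x,y) f(y) dy shifts Fourier modes injectively:
#     e^{imy} -> 2*pi * 2^{-|m|} e^{imx}   (m < 0)
#     e^{imy} -> 2*pi * 2^{-m} e^{i(m+1)x} (m >= 0)
N, K = 256, 40
x = 2 * np.pi * np.arange(N) / N
X, Y = np.meshgrid(x, x, indexing="ij")

phi_minus = sum(2.0 ** (-abs(k)) * np.exp(1j * k * (X - Y)) for k in range(-K, 0))
phi_plus = sum(2.0 ** (-k) * np.exp(1j * k * (X - Y)) for k in range(0, K + 1))
g = phi_minus + np.exp(1j * X) * phi_plus

# (a) \int_0^{2pi} g(x,y) dx = 0 for every fixed y
assert np.max(np.abs(g.mean(axis=0))) < 1e-10

# (b) mode-shifting: check a negative, a zero, and a positive frequency
dy = 2 * np.pi / N
for m in (-2, 0, 3):
    Tf = g @ np.exp(1j * m * x) * dy
    target = m if m < 0 else m + 1
    expected = 2 * np.pi * 2.0 ** (-abs(m)) * np.exp(1j * target * x)
    assert np.allclose(Tf, expected, atol=1e-10)
```

Since the image modes $\{\dots,-2,-1\} \cup \{1,2,\dots\}$ are pairwise distinct and never include $0$, the operator is injective while its range misses the constants.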
Willie Wong answered the general case, and I'd like to give a counterexample for the symplectic case:
Let $M=S^2$, the standard 2-sphere embedded in $\mathbb{R}^3$ as the unit sphere, with the standard symplectic form corresponding to the Euclidean metric on $\mathbb{R}^3$. Take on $(M \times M, \Omega = \omega \oplus - \omega)$ the functions \begin{align} f(z,w) &= z_3\cdot F(w), \\ g(z,w) &= z_3 \cdot G(w) \end{align}
both of which are in $L$. Now, $\{ f, g \} = \Omega(\mathrm{sgrad}(g), \mathrm{sgrad}(f)) =: \Omega'(\mathrm{d}f, \mathrm{d}g)$. Then,
\begin{align} \{f,g\} &= (\omega' \oplus (-\omega')) \left( F(w)\mathrm{d} z_3 + z_3 \mathrm{d} F(w), G(w)\mathrm{d} z_3 + z_3 \mathrm{d} G(w) \right) \\ &= F(w)G(w) \omega'(\mathrm{d}z_3,\mathrm{d}z_3) - z_3^2 \omega'(\mathrm{d} F(w),\mathrm{d} G(w)) \\ &= z_3^2 \{G,F\}(w) =:h(z,w) \end{align}
(The cross terms vanish because $\mathrm{d}z_3$ lives on the first factor while $\mathrm{d}F(w)$ and $\mathrm{d}G(w)$ live on the second, and $\omega'(\mathrm{d}z_3,\mathrm{d}z_3) = 0$.)
Now, unless $\{G,F\} \equiv 0$, $h$ will not be in $L$.
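For a concrete choice (assuming the standard bracket relations $\{w_i, w_j\} = \varepsilon_{ijk}\, w_k$ for the coordinate functions on the unit sphere with its area form, up to a sign convention), take
$$ F(w) = w_1, \quad G(w) = w_2 \quad \Longrightarrow \quad h(z,w) = z_3^2\, \{G,F\}(w) = \pm z_3^2\, w_3, $$
so $\{G,F\} \not\equiv 0$ and $h$ fails to lie in $L$.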