Spark 2.3.0 netty version issue: NoSuchMethod io.netty.buffer.PooledByteBufAllocator.metric()
It seems like you are using a too-old netty 4 version, or you have multiple netty jars on your classpath. Note that having netty 4.x and 3.x together on the classpath is not a problem, because they use different package names (`io.netty` vs `org.jboss.netty`); the conflict arises between different netty 4.x versions.
I would like to add some more details to make this easier to track down. Run:

`mvn dependency:tree -Dverbose -Dincludes=io.netty:netty-all`

This lists every dependency that pulls in `io.netty:netty-all` and the version each one requests. In my case the culprit was hive-jdbc 2.1.0, which depends on a netty-all older than the version Spark 2.3.1 needs; because Hive's copy was resolved first, Spark's newer netty was omitted from the classpath.
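To confirm at runtime which netty actually won the classpath race, a small diagnostic sketch like the following can help (the class and method names are netty's real ones; `NettyVersionCheck` is just an illustrative wrapper). It uses reflection to report where `PooledByteBufAllocator` was loaded from and whether it has the `metric()` method that Spark 2.3.x calls:

```java
public class NettyVersionCheck {
    public static void main(String[] args) {
        try {
            Class<?> cls = Class.forName("io.netty.buffer.PooledByteBufAllocator");
            // Show which jar on the classpath supplied the class
            System.out.println("loaded from: "
                    + cls.getProtectionDomain().getCodeSource().getLocation());
            // metric() exists in the newer netty 4.1.x line that Spark 2.3 ships,
            // but not in the older netty-all pulled in by hive-jdbc
            cls.getMethod("metric");
            System.out.println("metric() present: true");
        } catch (ClassNotFoundException e) {
            System.out.println("netty not on classpath");
        } catch (NoSuchMethodException e) {
            System.out.println("metric() present: false");
        }
    }
}
```

If this prints a hive-jdbc jar path or `metric() present: false`, you have found the stale netty.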
So the fix is to exclude the netty-all transitive dependency from hive-jdbc in your pom.xml.
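As a sketch, the exclusion looks like this (the `org.apache.hive:hive-jdbc` and `io.netty:netty-all` coordinates are the real Maven coordinates; adjust the version to whatever your project actually uses):

```xml
<dependency>
    <groupId>org.apache.hive</groupId>
    <artifactId>hive-jdbc</artifactId>
    <version>2.1.0</version>
    <exclusions>
        <!-- Drop Hive's older netty so Spark's newer netty-all wins resolution -->
        <exclusion>
            <groupId>io.netty</groupId>
            <artifactId>netty-all</artifactId>
        </exclusion>
    </exclusions>
</dependency>
```

After adding the exclusion, rerun `mvn dependency:tree` to verify that only Spark's netty-all version remains.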