ADCME is a package focused largely on providing automatic differentiation (AD) routines, whereas Turing is a general Probabilistic Programming Language (PPL) that lets you write models that take priors + data and produce posterior distributions (i.e., in the case of BNNs, take a randomly initialized set of prior parameter distributions and transform them into posterior distributions through neural network training). Flux, meanwhile, uses Zygote.jl as its AD backend to perform backpropagation on your neural net.
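To make the "priors + data → posterior" idea concrete, here is a minimal Turing sketch (the `coinflip` model name and the data are made up for illustration; the `@model`/`sample` pattern is standard Turing usage):

```julia
using Turing

# Prior belief about a coin's bias, updated by observed flips.
@model function coinflip(y)
    p ~ Beta(1, 1)                 # prior on the probability of heads
    for i in eachindex(y)
        y[i] ~ Bernoulli(p)        # likelihood of each observed flip
    end
end

data = [1, 1, 0, 1, 1, 0, 1, 1]    # illustrative flips, 1 = heads
chain = sample(coinflip(data), NUTS(), 1_000)  # draws from the posterior over p
```

A BNN in Turing works the same way conceptually: the priors sit on the network weights instead of a single `p`, and `sample` returns posterior draws over those weights.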
I agree that the best way to start is with a combination of Flux and Turing (notice that the Turing tutorial explicitly uses Flux anyway!). These are general, actively maintained libraries that also extend easily into other projects you may come across.
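As a sketch of the "AD backend you don't need to worry about" point: in ordinary Flux code you never call Zygote yourself; `Flux.gradient` dispatches to it behind the scenes. The layer sizes and random data below are just placeholders:

```julia
using Flux

# A small feed-forward network; Zygote supplies gradients automatically.
model = Chain(Dense(2 => 8, tanh), Dense(8 => 1))

x = rand(Float32, 2, 16)           # 16 samples, 2 features each (dummy data)
y = rand(Float32, 1, 16)

# Flux.gradient calls into Zygote under the hood -- no explicit Zygote code.
loss(m) = Flux.mse(m(x), y)
grads = Flux.gradient(loss, model)
```

This is the "standard" network that Turing can then wrap with priors on the weights to make it Bayesian.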
TL;DR: ADCME.jl is an AD library first, and Flux already has an AD backend that you don’t need to explicitly worry about when writing Flux code. Turing.jl allows you to turn “standard” networks into Bayesian ones. The Turing.jl tutorial you found is the way to start.