The question in the title is copied from Slack.
The short answer is no, there is no public interface for doing this. The learned parameters of each atomic model are, of course, stored as part of the learned parameters of the composite, so you could get at them in principle, but doing so would be a hack.
However, you could instead create your ensemble using a learning network, which you then export as a new composite model type. There is an example of just this in this Data Science Tutorial. Each atomic model amounts to a machine in the learning network, and `fitted_params(composite_machine)` gives you an access point to all machines trained in the network.
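To make that concrete, here is a minimal learning-network sketch. It assumes MLJ and MLJDecisionTreeInterface are installed; the model choices and the equal-weight averaging are illustrative, not taken from the tutorial:

```julia
using MLJ

Tree = @load DecisionTreeRegressor pkg=DecisionTree

# X, y: your training data
Xs = source(X)
ys = source(y)

# one machine per atomic model:
mach1 = machine(Tree(), Xs, ys)
mach2 = machine(Tree(max_depth=3), Xs, ys)

# average the two predictions (node arithmetic):
yhat = 0.5*predict(mach1, Xs) + 0.5*predict(mach2, Xs)

fit!(yhat)  # trains every machine in the network
```

Once such a network is exported as a composite model type and wrapped in a machine, `fitted_params` on that machine exposes each atomic machine, as shown below.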
After the line in the tutorial
```julia
mach = machine(one_hundred_models, X, y)
```
you can go on to do:
```julia
julia> fit!(mach, verbosity=0);

julia> machs = fitted_params(mach).machines;

julia> predictions = [predict(mach, X) for mach in machs];

julia> predictions |> mean
22.532806324110677
```