'_' as an argument in function definition

Let’s be a bit more precise here. The statement I’m questioning is this:

This suggests - at least to me - that, compared to a regular (unused) function argument, `_` is somehow special in that it gives extra "you cannot use it" guarantees to the compiler. If that is indeed what is meant here, I tend to disagree and would like to see an example demonstrating this different optimization potential. To be clear: I would claim that the optimization potential for `_` (cannot be used) and for a regular unused function argument (won't be used) is exactly the same.
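To make that concrete, here is a minimal sketch anyone can check in the REPL (the function names are mine). If `_` unlocked extra optimizations, the generated code would have to differ somewhere; as far as I can tell, it doesn't:

```julia
using InteractiveUtils  # for @code_llvm (loaded by default in the REPL)

f(_) = 1   # `_` argument: the value *cannot* be referenced in the body
g(x) = 1   # regular argument: the value simply isn't referenced

# Comparing the compiled code, e.g.
#   @code_llvm f(42)
#   @code_llvm g(42)
# shows the same IR: in both cases the argument is dead and the
# function just returns the constant.
```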

I don’t see how the thread you’ve linked adds any insight into this comparison. The only “new” thing it points out is that you can’t use `_` in conjunction with keyword arguments (which clearly is a difference from a regular argument). However, that is a bug, and in any case it says nothing about optimization potential.
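For concreteness, I believe the limitation looks roughly like this (a hypothetical sketch; the exact failing form and affected versions are in the linked thread):

```julia
# Underscore positional argument combined with keyword arguments:
# reportedly errors on affected Julia versions.
h(_; kw = 1) = kw

# The same definition with a named (but unused) argument works fine.
h(x; kw = 1) = kw
```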

But perhaps I’m just misunderstanding the statement in the first place?