Just as a datapoint: I have never written any code in APL, J, K, Q, etc., and I think & = min is perfectly defensible. If I had to switch regularly between K and (say) C or Python where & is bitwise (also perfectly defensible, of course) I'd probably find it irksome and make mistakes from time to time, but I don't think it's an unreasonable choice.
(But: I am a pure mathematician by background, and the idea that logical AND is a special case of MIN is pretty familiar to us. The thing whose only values are TRUE and FALSE is a particular Boolean algebra. A Boolean algebra is a particular kind of lattice, and lattices have MIN and MAX operations, though we usually call them MEET and JOIN instead. I would expect & = min to be less comfortable to people with different backgrounds.)
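(For the curious, the claim is easy to check concretely. This is a plain Python sketch, not K: encoding FALSE/TRUE as 0/1, `min` agrees with logical AND on booleans, and the same `min` extends to arbitrary numbers, which is the generalization the APL family picks for `&`.)

```python
# On booleans encoded as 0/1, min coincides with logical AND:
for a in (0, 1):
    for b in (0, 1):
        assert min(a, b) == (a and b)  # AND is just min restricted to {0, 1}

# On general numbers, the same min is the lattice MEET:
print(min(3, 7))  # prints 3
```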
[EDITED to fix an inconsequential typo.]