A typical laser diode emits from a 2x5 um source, with a beam of Gaussian cross section spread over roughly 20 deg x 5 deg. Now, if a flat (plane) wave were to hit a pinhole of 2x5 um size, diffraction would generate side bands at about +-17 deg and +-36 deg (for the 2 um dimension - see the Fraunhofer approximation), which would carry a significant fraction of the energy relative to the 0th order. This does not happen, however, for light leaving a laser diode. Can someone explain why? -al
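For reference, here is a minimal sketch of where those angles come from, using the single-slit Fraunhofer minima condition sin(theta_m) = m*lambda/a. The post does not state the wavelength, so lambda ~ 590 nm is an assumption chosen here for illustration (it happens to reproduce the quoted angles for a 2 um slit):

```python
import math

wavelength = 0.59e-6  # assumed ~590 nm; not stated in the original post
slit = 2e-6           # 2 um slit (the narrow axis of the emitter)

# Single-slit Fraunhofer diffraction minima: sin(theta_m) = m * lambda / a
for m in (1, 2):
    s = m * wavelength / slit
    if s <= 1:  # a minimum exists only if the sine is physical
        print(f"m={m}: theta = {math.degrees(math.asin(s)):.1f} deg")
# m=1 gives ~17.2 deg, m=2 gives ~36.2 deg
```

These are the dark-fringe angles bounding the side lobes; a different assumed wavelength would shift them accordingly.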
OK, here is my theory, if no one offers a better explanation. The diode case illustrates the difference between the classical and quantum approaches. For diffraction to exist there must be interference: the wave function of photons exiting at the left edge of an opening must be entangled with that of photons leaving at the right edge. This happens for a flat wave, but it does not happen for photons generated in a cavity whose entire size is 5 um. -al