EMC Question of the Week: January 8, 2018
In order to be in the "far-field" of a radiating electromagnetic source, the distance from a point to the source must be greater than
- a wavelength
- the largest dimension of the source
- both "a" and "b"
- none of the above
The best answer is "c". While there is no precisely defined boundary between the near-field and far-field of a radiating source, there are two separate criteria for being in the far-field. First, a point must be at a distance on the order of a wavelength or more from the source in order to avoid being in the reactive near-field. Second, a point must be at a distance greater than D²/4λ (where D is the largest dimension of the source) to ensure that it is not in the Fresnel region (i.e., the region where the relative phase of fields from different parts of the antenna is still a strong function of distance).
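The two criteria above can be combined into a simple distance estimate. The sketch below is illustrative: the function name is an assumption, and "on the order of a wavelength" is approximated as exactly one wavelength, which is a judgment call rather than a hard rule.

```python
import math

C = 299_792_458.0  # speed of light in vacuum, m/s


def far_field_distance(freq_hz, largest_dim_m):
    """Estimate the far-field boundary distance (illustrative helper).

    Applies both criteria from the text: the point must be roughly a
    wavelength away (to escape the reactive near-field) and beyond
    D**2 / (4 * wavelength) (to escape the Fresnel region). The
    governing criterion is whichever distance is larger.
    """
    wavelength = C / freq_hz
    reactive_limit = wavelength  # "on the order of a wavelength"
    fresnel_limit = largest_dim_m**2 / (4 * wavelength)
    return max(reactive_limit, fresnel_limit)


# Example: a 1 m source at 300 MHz (wavelength is just under 1 m),
# so the wavelength criterion dominates the Fresnel criterion here.
print(far_field_distance(300e6, 1.0))
```

Note that for electrically large antennas (D much greater than a wavelength), the Fresnel criterion dominates; for electrically small sources, the wavelength criterion does.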
In the far-field, the electric and magnetic fields are essentially in phase, they have a well-defined ratio, and they both decrease in amplitude inversely with the distance from the source.
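In free space, that well-defined ratio of |E| to |H| is the intrinsic impedance of free space, about 377 Ω, and the 1/r amplitude falloff means doubling the distance halves the field strength. A short numerical check (the scaling helper is an illustrative assumption, not a standard API):

```python
import math

MU_0 = 4 * math.pi * 1e-7       # permeability of free space, H/m
EPS_0 = 8.8541878128e-12        # permittivity of free space, F/m

# Far-field ratio |E|/|H| in free space: sqrt(mu_0 / eps_0) ~ 377 ohms
eta_0 = math.sqrt(MU_0 / EPS_0)
print(round(eta_0, 1))  # -> 376.7


def scale_far_field(amplitude_ref, r_ref, r):
    """Scale a far-field amplitude from distance r_ref to r (1/r falloff)."""
    return amplitude_ref * (r_ref / r)


# Doubling the distance halves the amplitude:
print(scale_far_field(1.0, 10.0, 20.0))  # -> 0.5
```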
Of course, as with any multiple choice question, an argument can always be made for "none of the above." That's why we rarely include it as an option.