Desense is the degradation in receiver sensitivity due to noise sources, typically ones generated by the same device the radio is in. An example will make this clear.
Suppose we are looking at a certain frequency (let's say UMTS Band V, channel 9162) and the sensitivity of the receiver is -110 dBm. This means that a reliable data link can be maintained when the received power is -110 dBm or higher; if the received power is lower (-111 dBm, for instance), the bit error rate (BER) will not be acceptable. For illustration, we will assume we are talking about a smartphone, with memory chips, an LCD display, a camera, and so on.
You should read the page on Total Isotropic Sensitivity (TIS). TIS is basically the sensitivity of the receiver-antenna system, integrated over all possible angles.
Now, let's say we connect an antenna to the receiver, and the antenna efficiency is -3 dB. What do you think the Total Isotropic Sensitivity should be? You would expect the measured TIS (average sensitivity) to be -110 dBm - (-3 dB) = -107 dBm.
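The dB bookkeeping above can be sketched in a few lines. This is just the arithmetic from the example, with the -110 dBm sensitivity and -3 dB antenna efficiency taken from the text:

```python
# Sketch: how a -3 dB antenna efficiency degrades conducted sensitivity into TIS.
conducted_sensitivity_dbm = -110.0  # receiver sensitivity at the antenna port
antenna_efficiency_db = -3.0        # total antenna efficiency (a loss)

# A -3 dB efficient antenna delivers only half the incident power to the
# receiver, so the incident (isotropic) power must be 3 dB higher to stay
# at the sensitivity threshold:
tis_dbm = conducted_sensitivity_dbm - antenna_efficiency_db
print(tis_dbm)  # -107.0 dBm

# For scale, convert the conducted sensitivity to linear power:
sensitivity_mw = 10 ** (conducted_sensitivity_dbm / 10)
print(sensitivity_mw)  # 1e-11 mW
```

Note how small that linear power is: this is why even very weakly radiated onboard noise matters.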
Let's say you measure a TIS of -107 dBm when no other processes on your phone are running (the camera is off, the display is off, no background applications are running). To see how the receiver is affected by other electronics, you then turn on a memory-intensive application, which has the memory sending signals back and forth to the CPU during your measurement. You run the sensitivity measurement again, and this time you measure TIS = -97 dBm.
What happened? You just lost 10 dB of sensitivity. This 10 dB loss is known as desense. How did this happen? The electric signals between the CPU and the memory operate at a relatively low frequency (maybe a few hundred MHz), but a harmonic of these signals falls at the same frequency where we are measuring sensitivity. The signals to the memory travel on transmission lines, and even transmission lines act as (very poor) antennas. We also have a very clean path to the receiver: the smartphone's own antenna, whose job is to collect as much energy as possible. Hence, the antenna receives energy even when it is onboard noise we don't want it to receive. And because the sensitivity of the receiver is so low (-100 dBm corresponds to 10^-10 mW), even the tiny power radiated from the memory transmission lines is picked up. This effectively raises the noise level, and since sensitivity is a function of the Signal to Noise Ratio (SNR), we get a loss in sensitivity. This is illustrated in Figure 1:
Figure 1. Illustration of Noise Power Collected By the Receiver
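The harmonic mechanism is easy to check numerically. As a sketch, suppose a (hypothetical) 175 MHz memory clock; the 869-894 MHz range used below is the UMTS Band V downlink band:

```python
# Sketch: find which harmonics of a hypothetical 175 MHz memory clock
# land inside the UMTS Band V downlink band (869-894 MHz).
clock_mhz = 175.0         # assumed memory-interface clock (for illustration)
band = (869.0, 894.0)     # UMTS Band V downlink, MHz

offenders = [n for n in range(1, 11)
             if band[0] <= n * clock_mhz <= band[1]]
print(offenders)  # [5] -> the 5th harmonic, at 875 MHz, falls in-band
```

A fundamental nowhere near the receive band can therefore still desense the radio through one of its harmonics.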
Sources of Desense
The desense discussed above pertained to the memory, which is a common source of WiFi noise in computers. But desense can come from pretty much anything your computer, smartphone, or other device does: memory traffic between the CPU and RAM, the display, the camera, background applications, and so on.
In general, any process or feature on a device can be checked for desense by ensuring it is off during the initial sensitivity (TIS) measurement, and then re-running the sensitivity measurement with the process on.
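That on/off comparison is the whole test. A minimal sketch of the bookkeeping, using the example numbers from the text (-107 dBm quiet, -97 dBm with the aggressor running):

```python
# Sketch of the desense check described above: measure TIS with the suspect
# feature off (baseline), then again with it on, and compare.
def desense_db(baseline_tis_dbm, active_tis_dbm):
    """Desense is simply the rise in measured TIS when the aggressor runs."""
    return active_tis_dbm - baseline_tis_dbm

# Example numbers from the text: quiet phone vs. memory-intensive app running.
print(desense_db(-107.0, -97.0))  # 10.0 dB of desense
```

In a real lab, the two TIS numbers would come from chamber measurements; the function above just formalizes the subtraction.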
Mitigation of Desense
Desense mitigation is a must-do for modern devices. Desense can be reduced by finding how the noise energy is being radiated, and then doing everything possible to reduce that. For instance, in memory, there are pins that go up to the memory chip, and these act as small inefficient antennas. By shielding the pins or reducing their length, the radiation efficiency (and hence the radiated power) of the lines can be reduced.
Another technique is to try to keep lines symmetric. That is, if a signal is travelling one way on a wire, the return path should be near the forward path and a mirror image of the first line. This will help to cancel the radiated fields from the current flowing on the wire.
If you kill the antenna (by reducing the antenna efficiency), you will reduce the noise power reaching the receiver. However, since you are interested in maintaining a wireless link (and therefore want to maximize the SNR), killing the antenna reduces your signal just as much as the noise, so nothing is gained.
In general, the process of desense mitigation is (1) trying to find the noise source (which often is a harmonic of a lower frequency noise source), and (2) finding the mechanism for radiation or coupling to the antenna (which directs the noise power to the receiver). If one or both of these links can be reduced, the desense will be improved.
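It can help to quantify how much a given coupled noise power costs in sensitivity. As a sketch, if we assume the interference is noise-like and adds to the receiver's own noise floor in power, the desense is 10*log10(1 + P_int/P_floor); the -110 dBm floor below is an assumed value for illustration:

```python
import math

# Sketch: sensitivity loss when a coupled interferer raises the noise floor.
# Assumes the interference is noise-like, so the powers add linearly.
def desense_from_noise(p_floor_dbm, p_int_dbm):
    p_floor = 10 ** (p_floor_dbm / 10)   # receiver noise floor, linear mW
    p_int = 10 ** (p_int_dbm / 10)       # coupled interference, linear mW
    return 10 * math.log10(1 + p_int / p_floor)

# An interferer equal to the floor costs ~3 dB; one 10 dB above it, ~10.4 dB.
print(round(desense_from_noise(-110.0, -110.0), 1))  # 3.0
print(round(desense_from_noise(-110.0, -100.0), 1))  # 10.4
```

This shows why reducing either link matters: every dB shaved off the coupled noise power directly shrinks the rise in the effective noise floor.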
Acceptable levels of desense are application specific. Sometimes 10 dB of desense is good, other times, 3 dB is considered too much.