ASoC: Allow drivers to specify how many bits are significant on a DAI

Most devices accept data in formats that don't correspond directly to
their internal representation. ALSA allows us to set an msbits
constraint which tells userspace about this in case it finds it useful
(for example, in order to avoid wasting effort dithering bits that will
be ignored when the sample size of the data is larger than the device
actually uses). Provide a mechanism for drivers to specify the number
of bits that are actually significant on a DAI, and add the appropriate
constraint along with all the others.
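
As a usage illustration (not part of this patch; the device name and
numbers here are made up), a codec driver whose DAC only uses the top
24 bits of a 32 bit sample might declare:

	static struct snd_soc_dai_driver example_dai = {
		.name = "example-hifi",		/* hypothetical device */
		.playback = {
			.stream_name	= "Playback",
			.channels_min	= 2,
			.channels_max	= 2,
			.rates		= SNDRV_PCM_RATE_8000_96000,
			.formats	= SNDRV_PCM_FMTBIT_S32_LE,
			.sig_bits	= 24,	/* DAC ignores the low 8 bits */
		},
	};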

This is done slightly awkwardly since the constraint must be specified
per sample size: we loop over every possible sample size, including
ones the device doesn't support and ones with fewer bits than are
actually significant, but this is harmless as the upper layers do the
right thing in these cases.
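
For illustration, the core could apply the constraint at stream open
roughly as follows. This is a sketch of the approach rather than the
exact patch body (which is not shown below); soc_pcm_apply_msb() and
the sample_sizes[] table are names used here for illustration, in
soc-pcm.c context:

	static int sample_sizes[] = {
		8, 16, 24, 32,
	};

	static void soc_pcm_apply_msb(struct snd_pcm_substream *substream,
				      struct snd_soc_dai *dai)
	{
		int ret, i, bits;

		if (substream->stream == SNDRV_PCM_STREAM_PLAYBACK)
			bits = dai->driver->playback.sig_bits;
		else
			bits = dai->driver->capture.sig_bits;

		if (!bits)
			return;

		/*
		 * Constrain every sample size; the upper layers ignore
		 * constraints that don't apply to the configured format.
		 */
		for (i = 0; i < ARRAY_SIZE(sample_sizes); i++) {
			ret = snd_pcm_hw_constraint_msbits(substream->runtime,
							   0, sample_sizes[i],
							   bits);
			if (ret != 0)
				dev_warn(dai->dev,
					 "Failed to set MSB %d/%d: %d\n",
					 bits, sample_sizes[i], ret);
		}
	}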

Signed-off-by: Mark Brown <broonie@opensource.wolfsonmicro.com>
Acked-by: Liam Girdwood <lrg@ti.com>
diff --git a/include/sound/soc.h b/include/sound/soc.h
index 0992dff..55381fc 100644
--- a/include/sound/soc.h
+++ b/include/sound/soc.h
@@ -505,6 +505,7 @@
 	unsigned int rate_max;		/* max rate */
 	unsigned int channels_min;	/* min channels */
 	unsigned int channels_max;	/* max channels */
+	unsigned int sig_bits;		/* number of bits of content */
 };
 
 /* SoC audio ops */