Coding a priority encoder

digital-logic encoder encoding optimization

Let's say I have a 25-input priority encoder, so each input pin from \$i_{24}\$ to \$i_0\$ has its own distinct priority. That means that if more than one input is active, the encoder acts as if only the input with the highest priority were active and returns a 5-bit vector \$out_{4..0}\$ (since \$\lceil\log_2(25)\rceil = 5\$).
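
For reference, a minimal behavioral sketch of such a fixed-priority encoder (my own illustration, assuming the conventional ordering in which \$i_{24}\$ has the highest priority):

```python
# Behavioral model of a fixed-priority encoder (assumes i_24 has the highest
# priority; the question does not state the ordering explicitly).
def priority_encode(inputs):
    """inputs[24] .. inputs[0]: 25 request lines; returns the index of the
    highest-priority active input, or None if none are active."""
    for idx in range(24, -1, -1):   # scan from highest priority down
        if inputs[idx]:
            return idx              # fits in ceil(log2(25)) = 5 bits
    return None

# Example: i_3 and i_17 both active -> the encoder reports 17.
req = [False] * 25
req[3] = req[17] = True
assert priority_encode(req) == 17
```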

Ok, I guess that intro was not needed; everyone knows how a priority encoder works. My question is: if we want to change the priority of the inputs and code a completely arbitrary priority for each input, how many bits are needed to contain the entire coding information?

So my line of thought was this: the first input has 25 possible priorities, so we need 5 bits for its value. Until only 16 inputs remain, we need 5 bits for each input, so that's 9 inputs coded with 5 bits each. Then we need 4 bits for the next 8 inputs, 3 bits for the next 4 inputs, 2 bits for the next 2 inputs, 1 bit for the next input, and zero bits for the last remaining input. That totals \$9\cdot 5 + 8\cdot 4 + 4\cdot 3 + 2\cdot 2 + 1\cdot 1 + 1\cdot 0 = 94\$ bits. Is this correct? Do we need 94 bits to completely code the priorities for a 25-input priority encoder?
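
As a sanity check on the arithmetic, a short script (my own, using the same \$\lceil\log_2\rceil\$ rounding as above) reproduces the 94-bit total:

```python
from math import ceil, log2

# The k-th input to be assigned still has (25 - k) free priority slots;
# each choice is rounded up to a whole number of bits, as in the question.
total = sum(ceil(log2(remaining)) for remaining in range(25, 0, -1))
print(total)  # 94 = 9*5 + 8*4 + 4*3 + 2*2 + 1*1 + 1*0
```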

Best Answer

One approach would be for each of the 25 inputs to show either 0 if deselected or its own 5-bit priority value \$P(i)\$ if selected. Throw all of these intermediate values into a sorting network that returns the maximum priority shown in a particular cycle. Then give each input a comparator that checks whether its shown priority equals that maximum, yielding a single bit per input, and apply a classic priority-encoder technique (e.g. bitscan) to those bits.

Total storage for all of the priority bits is \$25 \times 5 = 125\$, as @Dave_Tweed mentions.
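
To make the data flow concrete, here is a rough behavioral sketch in Python (my own illustration of the scheme above, not production RTL). It assumes the programmed priorities are distinct values in the range 1..25, so that 0 can be reserved to mean "deselected":

```python
from typing import List, Optional

def programmable_priority_encode(inputs: List[bool], P: List[int]) -> Optional[int]:
    """inputs: 25 request lines; P: 25 distinct programmed priorities (1..25).
    Returns the index of the winning input, or None if nothing is requested."""
    shown = [P[i] if inputs[i] else 0 for i in range(25)]  # each input shows 0 or P(i)
    if not any(inputs):
        return None
    best = max(shown)                                      # the sorting-network / max stage
    match = [shown[i] == best for i in range(25)]          # one 5-bit comparator per input
    # classic fixed-priority bitscan over the match bits (only one bit is set
    # here, because the programmed priorities are distinct)
    return next(i for i in range(24, -1, -1) if match[i])

# Example: give i_2 the top priority, then activate i_2 and i_20 together.
P = list(range(1, 26))     # i_0 -> 1, ..., i_24 -> 25 (default ordering)
P[2], P[24] = P[24], P[2]  # swap so that i_2 now holds the highest priority
assert programmable_priority_encode([i in (2, 20) for i in range(25)], P) == 2
```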