There may have been a wire in the wrong place or something. I did have it spread across three breadboards. I sketched everything out in a schematic, then pulled all the wires out and started over, following the schematic. Then things worked! Here's the schematic I used. The LED only lights up once the D input to the first flip-flop has gone HIGH and then LOW, which is how it should work.
BISG is an INPUT signal, not an INOUT, despite the fact that it has other uses (driving segment G at the top level).
To see this, reflect that NOTHING in your component actually drives BISG.
So your component is regarded as driving it with an undefined initial level, 'U', and the resolution of that 'U' with any external driving level is, as ISim says, 'U'.
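That resolution rule can be sketched in Python, if it helps to see why 'U' wins. This is a simplified subset of the IEEE 1164 resolution table (the function name and the restricted value set are mine, not ISim's):

```python
def resolve(a, b):
    """Simplified IEEE 1164 resolution for two drivers,
    restricted to the values 'U', 'X', '0', '1', 'Z'."""
    if 'U' in (a, b):      # uninitialized dominates everything
        return 'U'
    if 'X' in (a, b):      # unknown dominates any known value
        return 'X'
    if a == 'Z':           # high impedance yields to the other driver
        return b
    if b == 'Z':
        return a
    return a if a == b else 'X'   # '0' fighting '1' is a conflict

# Your component drives 'U'; the testbench drives '1':
resolve('U', '1')   # 'U' — exactly what ISim shows you
# After the fix, your component drives 'Z' instead:
resolve('Z', '1')   # '1' — the testbench value wins
```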
You have two options to fix it:
(1) make BISG an INPUT since you aren't driving it. This is the simplest and in this case correct.
(2) Drive BISG in your component with 'Z'; that is, remove the 'U' output from the component. 'Z' means "high impedance", i.e. don't fight whatever external source (your testbench) is driving the pin. This is typically done to allow two-way communication on a signal: the two ends must agree (somehow) which end is allowed to drive the signal, and the other end must drive 'Z'.
In your case, adding the line
BISG <= 'Z';
(after the "begin") to your component BCT should resolve the problem. (Yes, you are modifying the file labelled "do not modify", and the modification WILL disappear when you next compile the schematic to VHDL).
Then re-run the simulation and all should be well.
I do not know if there's a way to implement solution 2 in the schematic, and I don't care: these days the schematic approach is an utter waste of time.
Your whole schematic comes down to (the original entity, and)

architecture HDL of BCT is
begin
    Bisg     <= 'Z';
    Bnisc    <= not Bisg;
    sf       <= not Bisg and not A;
    saisdise <= Bisg or not A;
    sb       <= '1';
end HDL;

and simplifies further if Bisg is an input...
Which do you think is faster to create, and easier to read?
EDIT: if you need an output to drive segment G, there's nothing wrong with a separate port G, driven directly from input B, as
G <= B;
or a wire on the schematic.
There were some hobbyist experiments (in the early '80s, I believe) with decoding variable-speed digital data, with the intent of distributing code to accompany magazine articles by printing it as bar codes. The reader (person, not machine) could then scan it into their machine with a hand-held scanner. It was assumed impractical (until shown otherwise) due to one's inability to hand-scan at a uniform speed.
The solution ended up being for the decoder program to initially collect enough white-to-black and black-to-white transition times to discover the mean wide-bar and narrow-bar times, assign '1's and '0's respectively to the collected data, and then continue decoding the incoming stream while simultaneously updating the wide-bar and narrow-bar mean times to account for changes in wand speed over the bars.
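The adaptive part of that algorithm can be sketched in Python. All the names and the exponential-averaging constant here are illustrative assumptions, not the original program:

```python
def make_adaptive_classifier(init_narrow, init_wide, alpha=0.2):
    """Return a classifier for bar durations: '0' for narrow, '1' for wide.

    The running mean for each class is nudged toward every new sample
    it claims, so the thresholds track changes in the wand speed.
    """
    means = {"0": float(init_narrow), "1": float(init_wide)}

    def classify(duration):
        # Assign the sample to whichever class mean it is closer to.
        if abs(duration - means["0"]) <= abs(duration - means["1"]):
            bit = "0"
        else:
            bit = "1"
        # Exponential moving average: follow the drifting scan speed.
        means[bit] += alpha * (duration - means[bit])
        return bit

    return classify

# Seed the means from an initial collection phase, then decode:
classify = make_adaptive_classifier(init_narrow=10, init_wide=30)
bits = [classify(d) for d in [12, 34, 14, 40, 16, 46]]  # wand slowing down
```

Even though every duration grows as the (hypothetical) wand slows, each sample stays closer to its own drifting mean than to the other, so the classification holds.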
The same technique was applied to decoding hand-sent Morse off the air, with a simple circuit going high and low with the receiver's audio output, fed to a similar algorithm.
Your hardware decoder would need to be similarly adaptive.
As an aside, an interesting issue came up when the strings 'T5' or '6E' appeared frequently in the decoded text. Operators naturally developed keying habits on common words, and would key (and understand), for example, the word 'the' as
Dah - dit dit dit dit - dit
which the decoding algorithm, doing its best with slightly uneven element spacing and a dit/dah string that didn't match any single known Morse character, rendered as one of the digraphs above.
You might find some of those articles in early issues of Byte Magazine at the library.