Leftmost and Rightmost Derivations

programming-languages theory

So I understand the semantics of derivations as far as Backus-Naur Form goes. One thing I cannot find in any textbook, or in the various lecturers' notes that are online, is this.

When would a rightmost derivation be beneficial over a leftmost derivation? Would a compiler or parser generator always use one or the other, or would it use both?

So here's an example:

Suppose we had the following language sentence: A = B + C * A

Example Language Grammar Permitting the Above:

<assign>  <id> = <expr> 
<id>  A | B | C
<expr>  <expr> + <term>
        |<term>
<term> <term> * <factor> 
       |<factor>
<factor>  ( expr )
        | <id>

Leftmost Derivation:

<assign> => <id> = <expr>
         => A = <expr>
         => A = <expr> + <term>
         => A = <term> + <term>
         => A = <factor> + <term>
         => A = <id> + <term>
         => A = B + <term>
         => A = B + <term> * <factor>
         => A = B + <factor> * <factor>
         => A = B + <id> * <factor>
         => A = B + C * <factor>
         => A = B + C * <id>
         => A = B + C * A

Rightmost Derivation:

<assign> => <id> = <expr>
         => <id> = <expr> + <term>
         => <id> = <expr> + <term> * <factor>
         => <id> = <expr> + <term> * <id>
         => <id> = <expr> + <term> * A
         => <id> = <expr> + <factor> * A
         => <id> = <expr> + <id> * A
         => <id> = <expr> + C * A
         => <id> = <term> + C * A
         => <id> = <factor> + C * A
         => <id> = <id> + C * A
         => <id> = B + C * A
         => A = B + C * A
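If you want to check the two derivations mechanically, here is a minimal Python sketch (the grammar encoding, the production-index lists, and the helper names are my own, not part of the question). It replays a fixed sequence of production choices, always expanding either the leftmost or the rightmost nonterminal, and both runs end in the same sentence A = B + C * A.

# Each nonterminal maps to a list of alternatives; each alternative
# is a list of grammar symbols.
GRAMMAR = {
    "<assign>": [["<id>", "=", "<expr>"]],
    "<id>":     [["A"], ["B"], ["C"]],
    "<expr>":   [["<expr>", "+", "<term>"], ["<term>"]],
    "<term>":   [["<term>", "*", "<factor>"], ["<factor>"]],
    "<factor>": [["(", "<expr>", ")"], ["<id>"]],
}

def derive(choices, leftmost=True):
    """Replay a derivation: at each step expand the leftmost (or rightmost)
    nonterminal, using the alternative index given by `choices`."""
    form = ["<assign>"]
    print(" ".join(form))
    for choice in choices:
        positions = [i for i, sym in enumerate(form) if sym in GRAMMAR]
        pos = positions[0] if leftmost else positions[-1]
        form[pos:pos + 1] = GRAMMAR[form[pos]][choice]
        print("=> " + " ".join(form))
    return form

# Alternative indices chosen to reproduce the derivations above.
derive([0, 0, 0, 1, 1, 1, 1, 0, 1, 1, 2, 1, 0], leftmost=True)
derive([0, 0, 0, 1, 0, 1, 1, 2, 1, 1, 1, 1, 0], leftmost=False)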

Best Answer

The way it was explained in my compiler class, well over thirty years ago, is that a leftmost derivation scheme works best with a top-down (e.g. recursive descent) parser, while a rightmost derivation scheme works best with a bottom-up (e.g. SLR(1), LALR(1), YACC et al.) parser, which traces out the rightmost derivation in reverse. Using a leftmost derivation scheme with a bottom-up parser would require the parser to "stack up" the entire sentence and then process it from the right end, rather than recognizing short phrases as it goes. Ditto for the opposite pairing.
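To make the top-down/leftmost connection concrete, here is a minimal recursive-descent sketch for the same grammar, assuming whitespace-separated tokens; the class and method names are mine, not from the answer. The left-recursive <expr> and <term> rules are rewritten as loops, the standard workaround, since direct left recursion would send a naive top-down parser into infinite recursion. The order in which the parse_* procedures are entered mirrors the leftmost derivation above.

# A sketch of a top-down (recursive descent) parser for the grammar.
class Parser:
    def __init__(self, text):
        self.tokens = text.split()
        self.pos = 0

    def peek(self):
        return self.tokens[self.pos] if self.pos < len(self.tokens) else None

    def expect(self, token):
        if self.peek() != token:
            raise SyntaxError(f"expected {token!r}, got {self.peek()!r}")
        self.pos += 1
        return token

    def parse_assign(self):                 # <assign> -> <id> = <expr>
        target = self.parse_id()
        self.expect("=")
        return ("assign", target, self.parse_expr())

    def parse_id(self):                     # <id> -> A | B | C
        if self.peek() in ("A", "B", "C"):
            return ("id", self.expect(self.peek()))
        raise SyntaxError(f"expected an identifier, got {self.peek()!r}")

    def parse_expr(self):                   # <expr> -> <term> { + <term> }
        node = self.parse_term()
        while self.peek() == "+":
            self.expect("+")
            node = ("+", node, self.parse_term())
        return node

    def parse_term(self):                   # <term> -> <factor> { * <factor> }
        node = self.parse_factor()
        while self.peek() == "*":
            self.expect("*")
            node = ("*", node, self.parse_factor())
        return node

    def parse_factor(self):                 # <factor> -> ( <expr> ) | <id>
        if self.peek() == "(":
            self.expect("(")
            node = self.parse_expr()
            self.expect(")")
            return node
        return self.parse_id()

print(Parser("A = B + C * A").parse_assign())
# ('assign', ('id', 'A'), ('+', ('id', 'B'), ('*', ('id', 'C'), ('id', 'A'))))

By contrast, a bottom-up (shift-reduce) parser for the same grammar would perform its reductions in exactly the reverse order of the rightmost derivation shown in the question.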

My Green Dragon Book and Red Dragon Book are currently in storage, or I'd go looking to see if they have examples or better explanations.


After thinking some more about it, the above discussion is really about left-recursive and right-recursive grammars, but I think the effect is the same. At the time, I found the result very satisfying, in that it said there wasn't one "best" way to write a grammar; rather, it depended on how your parser worked as well.
