You can expand your matrix grammar into an ADT with perfect sharing with a little bit of trickery:
{-# LANGUAGE DeriveFunctor, DeriveFoldable, DeriveTraversable #-}

import Data.Map (Map, (!))
import Data.Foldable
import Data.Functor
import Data.Traversable

-- | Type synonym for non-terminal symbols
type NonTerminal = String

-- | Data type for the right hand side of a production
data RHS a
  = DownStep NonTerminal NonTerminal NonTerminal NonTerminal
  | Terminal a
  deriving (Eq, Ord, Show, Read, Functor, Foldable, Traversable)

-- | A grammar: a start symbol plus one production per non-terminal
data G a = G NonTerminal (Map NonTerminal (RHS a))
  deriving (Eq, Ord, Show, Read, Functor)

-- | The fully expanded quadtree representation of the matrix
data M a = Q (M a) (M a) (M a) (M a) | T a
  deriving (Functor, Foldable, Traversable)

tabulate :: G a -> M a
tabulate (G s pm) = loeb (expand <$> pm) ! s where
  expand (DownStep a11 a12 a21 a22) m = Q (m ! a11) (m ! a12) (m ! a21) (m ! a22)
  expand (Terminal a) _ = T a

loeb :: Functor f => f (f b -> b) -> f b
loeb x = xs where xs = fmap ($ xs) x
Here I generalized your grammars to allow for any data type, not just Int, and tabulate will take the grammar and expand it by folding it in upon itself using loeb. loeb is described in an article by Dan Piponi.
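To get a feel for loeb on its own, here is a tiny spreadsheet-style illustration of my own (not taken from Piponi's article): each cell is a function of the final list of results, and loeb ties the recursion so every cell can see every other cell's value.

```haskell
-- Each cell is a function of the whole list of final results.
cells :: [[Int] -> Int]
cells = [ const 1                      -- cell 0: the literal 1
        , \xs -> xs !! 0 + 2           -- cell 1: cell 0 plus 2
        , \xs -> xs !! 0 + xs !! 1 ]   -- cell 2: sum of cells 0 and 1

loeb :: Functor f => f (f b -> b) -> f b
loeb x = xs where xs = fmap ($ xs) x

main :: IO ()
main = print (loeb cells)  -- [1,3,4]
```

Laziness is doing all the work here: each cell's thunk refers back to the very list being constructed, exactly as tabulate's expanded quadrants refer back to the map being built.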
The resulting expansion as an ADT physically takes no more memory than the original grammar -- in fact it takes a fair bit less, because it doesn't need the extra log-factor for the Map spine, and it doesn't need to store the strings at all.
Unlike the naive expansion, using loeb lets me 'tie the knot' and share the thunks for all occurrences of the same non-terminal.
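As a self-contained usage sketch (the grammar below is my own example, not from your question): a grammar whose start symbol fills all four quadrants with the same non-terminal expands to a matrix in which those four sub-matrices are a single shared thunk.

```haskell
{-# LANGUAGE DeriveFunctor, DeriveFoldable, DeriveTraversable #-}
import Data.Map (Map, (!), fromList)
import Data.Foldable (toList)

type NonTerminal = String

data RHS a = DownStep NonTerminal NonTerminal NonTerminal NonTerminal | Terminal a
data G a   = G NonTerminal (Map NonTerminal (RHS a))
data M a   = Q (M a) (M a) (M a) (M a) | T a
  deriving (Functor, Foldable, Traversable)

loeb :: Functor f => f (f b -> b) -> f b
loeb x = xs where xs = fmap ($ xs) x

tabulate :: G a -> M a
tabulate (G s pm) = loeb (expand <$> pm) ! s where
  expand (DownStep a11 a12 a21 a22) m = Q (m ! a11) (m ! a12) (m ! a21) (m ! a22)
  expand (Terminal a) _ = T a

-- "S" splits into four copies of "A"; "A" is a 1x1 terminal.
-- All four quadrants of the expansion point at the same thunk.
example :: G Int
example = G "S" (fromList [ ("S", DownStep "A" "A" "A" "A")
                          , ("A", Terminal 7) ])

main :: IO ()
main = print (toList (tabulate example))  -- [7,7,7,7]
```

The derived Foldable instance gives us toList for free, which flattens the quadtree in constructor-field order.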
If you want to dip more into the theory of all of this, we can see that RHS could be turned into a base functor:

data RHS t nt = Q nt nt nt nt | L t

and then my M type is just the fixed point of that Functor:

M a ~ Mu (RHS a)

while G a would consist of a chosen string and a map from strings to RHS a String.
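That correspondence can be spelled out directly. This is a sketch with names of my own choosing (Mu, toMu, fromMu), and I've renamed M's constructors so they don't clash with the base functor's:

```haskell
{-# LANGUAGE DeriveFunctor #-}

-- the usual fixed-point newtype
newtype Mu f = In { out :: f (Mu f) }

-- the base functor, parametrized over terminals t and non-terminal slots nt
data RHS t nt = Q nt nt nt nt | L t deriving Functor

-- the matrix type again, constructors renamed to avoid the clash
data M a = QM (M a) (M a) (M a) (M a) | TM a

-- witnesses for M a ~ Mu (RHS a)
toMu :: M a -> Mu (RHS a)
toMu (QM a b c d) = In (Q (toMu a) (toMu b) (toMu c) (toMu d))
toMu (TM x)       = In (L x)

fromMu :: Mu (RHS a) -> M a
fromMu (In (Q a b c d)) = QM (fromMu a) (fromMu b) (fromMu c) (fromMu d)
fromMu (In (L x))       = TM x
```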
We can then expand G into M by looking up the entry in a map of expanded strings lazily.
This is sort of the dual of what is done in the data-reify package, which can take such a base functor and something like M and recover the moral equivalent of your G from it. They use a different type for the non-terminal names, which is basically just an Int:
data Graph e = Graph [(Unique, e Unique)] Unique
and provide a combinator
reifyGraph :: MuRef s => s -> IO (Graph (DeRef s))
which can be used with an appropriate instance on the above data types to get a graph (MatrixGrammar) out of an arbitrary matrix. It won't do deduplication of identical but separately stored quadrants, but it'll recover all of the sharing that is present in the original graph.
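As a sketch of what that instance might look like for M with the two-parameter base functor above (hedged: I'm quoting the Data.Reify API from memory and haven't compiled this against any particular data-reify version):

```haskell
{-# LANGUAGE TypeFamilies #-}
import Data.Reify

data RHS t nt = Q nt nt nt nt | L t
data M a = QM (M a) (M a) (M a) (M a) | TM a

instance MuRef (M a) where
  type DeRef (M a) = RHS a
  mapDeRef f (QM a b c d) = Q <$> f a <*> f b <*> f c <*> f d
  mapDeRef _ (TM x)       = pure (L x)

-- reifyGraph (tabulate g) :: IO (Graph (RHS a))
-- recovers a finite graph with Int names standing in for the original strings.
```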
I use the null coalescing operator all of the time. I like the conciseness of it.
I find this operator to be similar in nature to the ternary operator (A ? B : C). It takes a little practice before reading it becomes second nature, but once you're used to it, I feel readability improves over the longhand versions.
Also, the situation you describe is only one scenario where the operator is useful. It's also handy to replace constructs like this:
if (value != null)
{
    return value;
}
else
{
    return otherValue;
}
or
return value != null ? value : otherValue;
with
return value ?? otherValue;
int is used for almost all integer variables in .NET, although often a smaller type would be enough. Also, unsigned types are almost never used, although they could be. Some reasons:

- Mixing different integer types in expressions (with + or < for example) involves implicit conversions. The rules are not obvious. I'm an experienced developer and I could not tell you the full set of rules. I do not need to know.
- int is fast on all common architectures. Smaller types often result in conversions which can be slower.
- Consistency: idiomatic .NET code uses int everywhere.
- A smaller type sends a signal: byte would suggest binary data for example. (See comment by Flater.)

It's a useful convention to use int.