You might have noticed by now that we've already been using literals a
lot in our examples--numeric literals and boolean literals. Why didn't we
have to quote them to keep Scheme from trying to evaluate them like other
expressions? Because Scheme has a special rule, which is that the
value of a number or boolean is that number or boolean. For these
data types, the result of attempting to evaluate one is the same as
what you started with. So the value of
4 is 4, and the value of
#f is #f. (This also works
for a few other types, such as characters and character strings.)
Scheme lets you type the printed representation of a value as an
expression, and by convention the value of that expression is the
value whose printed representation you typed. Such an expression is
called self-evaluating, because it evaluates to itself.
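For instance, here are a few self-evaluating expressions as you might type them at a Scheme prompt, with each result shown in a comment:

```scheme
;; Self-evaluating expressions: the value is the datum itself.
4        ; => 4
#f       ; => #f
#\a      ; => #\a   (a character)
"foo"    ; => "foo" (a string)
```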
What's the deep meaning of this rule? There isn't any. It's just to
keep you from having to type a lot of quotes to use simple literals.
Notice that that means that you can quote a number or boolean if you
want, and it doesn't make any difference. The expression
'0 means "literally the number 0," but since Scheme defines the value
of a number to be itself, the value of plain
0 is the same thing: the number 0.
Likewise, the value of
(quote #f) is the same as the value of '#f or plain
#f---they're all pointers to
the false object. You can write a string literal with or without a
quote: '"foo" or just
"foo". In either case, the value of the expression is a pointer
to a string object with the character sequence
foo.
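As a quick sketch of this, you can check with equal? that quoting a self-evaluating literal changes nothing (any Scheme implementation should agree):

```scheme
;; Quoting a self-evaluating literal makes no difference.
(equal? '4 4)           ; => #t
(equal? (quote #f) #f)  ; => #t
(equal? '"foo" "foo")   ; => #t
```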
Minor warning: don't add extra quotes inside expressions that
are already quoted.
'(foo 10 baz) is not the same as
'('foo '10 'baz). One quote for a whole literal expression
is enough, and extra quotes inside quotes do something that will
seem surprising until you understand how quoting really works.
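To see the difference, compare what the two expressions actually evaluate to; each 'x inside the outer quote is read as (quote x), so the extra quotes become part of the data (some implementations may print the result abbreviated with quote marks again):

```scheme
'(foo 10 baz)     ; => (foo 10 baz)
'('foo '10 'baz)  ; => ((quote foo) (quote 10) (quote baz))
```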
Expression evaluation in Scheme is simple, for the most part, but you must remember the rules for the special forms (which don't always evaluate their arguments) and self-evaluation. Later, I'll show how an interpreter implements self-evaluation by analyzing expressions before evaluating them. Still later, I'll show how a compiler can do the same work at compile time, so that using literals doesn't cost any evaluation overhead at run time.