<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01//EN"
          "http://www.w3.org/TR/html4/strict.dtd">

<html>
<head>
  <title>Kaleidoscope: Adding JIT and Optimizer Support</title>
  <meta http-equiv="Content-Type" content="text/html; charset=utf-8">
  <meta name="author" content="Chris Lattner">
  <link rel="stylesheet" href="../llvm.css" type="text/css">
</head>

<body>

<div class="doc_title">Kaleidoscope: Adding JIT and Optimizer Support</div>

<div class="doc_author">
  <p>Written by <a href="mailto:sabre@nondot.org">Chris Lattner</a></p>
</div>

<!-- *********************************************************************** -->
<div class="doc_section"><a name="intro">Part 4 Introduction</a></div>
<!-- *********************************************************************** -->

<div class="doc_text">

<p>Welcome to part 4 of the "<a href="index.html">Implementing a language with
LLVM</a>" tutorial.  Parts 1-3 described the implementation of a simple language
and included support for generating LLVM IR.  This chapter describes two new
techniques: adding optimizer support to your language, and adding JIT compiler
support.  This shows how to get nice efficient code for your language.</p>

</div>

<!-- *********************************************************************** -->
<div class="doc_section"><a name="trivialconstfold">Trivial Constant
Folding</a></div>
<!-- *********************************************************************** -->

<div class="doc_text">

<p>
Our demonstration for Chapter 3 is elegant and easy to extend.  Unfortunately,
it does not produce wonderful code.  For example, when compiling simple code,
we don't get obvious optimizations:</p>

<div class="doc_code">
<pre>
ready&gt; <b>def test(x) 1+2+x;</b>
Read function definition:
define double @test(double %x) {
entry:
        %addtmp = add double 1.000000e+00, 2.000000e+00
        %addtmp1 = add double %addtmp, %x
        ret double %addtmp1
}
</pre>
</div>

<p>This code is a very literal transcription of the AST built by parsing our
code, and as such, lacks optimizations like constant folding (we'd like to get
"<tt>add x, 3.0</tt>" in the example above) as well as other more important
optimizations.  Constant folding in particular is a very common and very
important optimization: so much so that many language implementors implement
constant folding support in their AST representation.</p>
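
<p>To make the contrast concrete, here is a minimal sketch of what folding at the
AST level might look like if we had to do it ourselves.  This is not code from
the tutorial: the <tt>FoldBinaryExpr</tt> helper and the <tt>getVal()</tt>
accessor it assumes on <tt>NumberExprAST</tt> are hypothetical, and exist only to
show the bookkeeping that the builder-based approach saves us:</p>

<div class="doc_code">
<pre>
/// FoldBinaryExpr - Hypothetical AST-level constant folder: if both operands
/// are numeric literals, evaluate the operator now instead of building a node.
static ExprAST *FoldBinaryExpr(char Op, ExprAST *LHS, ExprAST *RHS) {
  NumberExprAST *L = dynamic_cast&lt;NumberExprAST*&gt;(LHS);
  NumberExprAST *R = dynamic_cast&lt;NumberExprAST*&gt;(RHS);
  if (L &amp;&amp; R) {
    switch (Op) {
    case '+': return new NumberExprAST(L-&gt;getVal() + R-&gt;getVal());
    case '-': return new NumberExprAST(L-&gt;getVal() - R-&gt;getVal());
    case '*': return new NumberExprAST(L-&gt;getVal() * R-&gt;getVal());
    }
  }
  // Not foldable (or an operator we don't fold): build a normal AST node.
  return new BinaryExprAST(Op, LHS, RHS);
}
</pre>
</div>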

<p>With LLVM, you don't need to.  Since all calls to build LLVM IR go through
the LLVM builder, it would be nice if the builder itself checked to see if there
was a constant folding opportunity when you call it.  If so, it could just do
the constant fold and return the constant instead of creating an instruction.
This is exactly what the <tt>LLVMFoldingBuilder</tt> class does.  Let's make one
change:</p>

<div class="doc_code">
<pre>
static LLVMFoldingBuilder Builder;
</pre>
</div>

<p>All we did was switch from <tt>LLVMBuilder</tt> to
<tt>LLVMFoldingBuilder</tt>.  Though we changed no other code, all of our
instructions are now implicitly constant folded without us having to do anything
about it.  For example, our example above now compiles to:</p>

<div class="doc_code">
<pre>
ready&gt; <b>def test(x) 1+2+x;</b>
Read function definition:
define double @test(double %x) {
entry:
        %addtmp = add double 3.000000e+00, %x
        ret double %addtmp
}
</pre>
</div>

<p>Well, that was easy. :)  In practice, we recommend always using
<tt>LLVMFoldingBuilder</tt> when generating code like this.  It has no
"syntactic overhead" for its use (you don't have to uglify your compiler with
constant checks everywhere) and it can dramatically reduce the amount of
LLVM IR that is generated in some cases (particularly for languages with a macro
preprocessor or that use a lot of constants).</p>

<p>On the other hand, the <tt>LLVMFoldingBuilder</tt> is limited by the fact
that it does all of its analysis inline with the code as it is built.  If you
take a slightly more complex example:</p>

<div class="doc_code">
<pre>
ready&gt; <b>def test(x) (1+2+x)*(x+(1+2));</b>
ready&gt; Read function definition:
define double @test(double %x) {
entry:
        %addtmp = add double 3.000000e+00, %x
        %addtmp1 = add double %x, 3.000000e+00
        %multmp = mul double %addtmp, %addtmp1
        ret double %multmp
}
</pre>
</div>

<p>In this case, the LHS and RHS of the multiplication are the same value.  We'd
really like to see this generate "<tt>tmp = x+3; result = tmp*tmp;</tt>" instead
of computing "<tt>x+3</tt>" twice.</p>

<p>Unfortunately, no amount of local analysis will be able to detect and correct
this.  This requires two transformations: reassociation of expressions (to
make the adds lexically identical) and Common Subexpression Elimination (CSE)
to delete the redundant add instruction.  Fortunately, LLVM provides a broad
range of optimizations that you can use, in the form of "passes".</p>

</div>

<!-- *********************************************************************** -->
<div class="doc_section"><a name="optimizerpasses">LLVM Optimization
 Passes</a></div>
<!-- *********************************************************************** -->

<div class="doc_text">

<p>LLVM provides many optimization passes, which do many different sorts of
things and have different tradeoffs.  Unlike other systems, LLVM doesn't hold
to the mistaken notion that one set of optimizations is right for all languages
and for all situations.  LLVM allows a compiler implementor to decide what
optimizations to use, in which order, and in which situations.</p>

<p>As a concrete example, LLVM supports both "whole module" passes, which look
across as large a body of code as they can (often a whole file, but if run
at link time, this can be a substantial portion of the whole program), and
"per-function" passes, which just operate on a single function at a time,
without looking at other functions.  For more information
on passes and how they are run, see the <a href="../WritingAnLLVMPass.html">How
to Write a Pass</a> document.</p>

<p>For Kaleidoscope, we are currently generating functions on the fly, one at
a time, as the user types them in.  We aren't shooting for the ultimate
optimization experience in this setting, but we also want to catch the easy and
quick stuff where possible.  As such, we will choose to run a few per-function
optimizations as the user types the function in.  If we wanted to make a "static
Kaleidoscope compiler", we would use exactly the code we have now, except that
we would defer running the optimizer until the entire file has been parsed.</p>
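
<p>For that hypothetical static compiler, the whole-module flavor of this would
be a <tt>PassManager</tt> run over <tt>TheModule</tt> once the entire file has
been parsed.  The sketch below only illustrates that shape; the inliner pass
(and the <tt>llvm/Transforms/IPO.h</tt> header it needs), and the exact
<tt>TargetData</tt> constructor you would use, are not part of this chapter's
code:</p>

<div class="doc_code">
<pre>
  // Sketch: a whole-module pipeline for a static compiler, run after parsing.
  PassManager PM;
  PM.add(new TargetData(TheModule));
  // Inline small functions across the whole module.
  PM.add(createFunctionInliningPass());
  // Then run the same per-function cleanups described below.
  PM.add(createInstructionCombiningPass());
  PM.add(createReassociatePass());
  PM.add(createGVNPass());
  PM.add(createCFGSimplificationPass());
  PM.run(*TheModule);
</pre>
</div>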

<p>In order to get per-function optimizations going, we need to set up a
<a href="../WritingAnLLVMPass.html#passmanager">FunctionPassManager</a> to hold and
organize the LLVM optimizations that we want to run.  Once we have that, we can
add a set of optimizations to run.  The code looks like this:</p>

<div class="doc_code">
<pre>
  ExistingModuleProvider OurModuleProvider(TheModule);
  FunctionPassManager OurFPM(&amp;OurModuleProvider);

  // Set up the optimizer pipeline.  Start with registering info about how the
  // target lays out data structures.
  OurFPM.add(new TargetData(*TheExecutionEngine-&gt;getTargetData()));
  // Do simple "peephole" optimizations and bit-twiddling optzns.
  OurFPM.add(createInstructionCombiningPass());
  // Reassociate expressions.
  OurFPM.add(createReassociatePass());
  // Eliminate Common SubExpressions.
  OurFPM.add(createGVNPass());
  // Simplify the control flow graph (deleting unreachable blocks, etc).
  OurFPM.add(createCFGSimplificationPass());

  // Set the global so the code gen can use this.
  TheFPM = &amp;OurFPM;

  // Run the main "interpreter loop" now.
  MainLoop();
</pre>
</div>

<p>This code defines two objects, an <tt>ExistingModuleProvider</tt> and a
<tt>FunctionPassManager</tt>.  The former is basically a wrapper around our
<tt>Module</tt> that the PassManager requires.  It provides certain flexibility
that we're not going to take advantage of here, so I won't dive into what it is
all about.</p>

<p>The meat of the matter is the definition of "<tt>OurFPM</tt>".  It
requires a pointer to the <tt>Module</tt> (through the <tt>ModuleProvider</tt>)
to construct itself.  Once it is set up, we use a series of "add" calls to add
a bunch of LLVM passes.  The first pass is basically boilerplate: it adds a pass
so that later optimizations know how the data structures in the program are
laid out.  The "<tt>TheExecutionEngine</tt>" variable is related to the JIT,
which we will get to in the next section.</p>

<p>In this case, we choose to add 4 optimization passes.  The passes we chose
here are a pretty standard set of "cleanup" optimizations that are useful for
a wide variety of code.  I won't delve into what they do but, believe me, they
are a good starting place.</p>
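
<p>Adding further passes follows exactly the same pattern.  As a purely
illustrative sketch (our current code generator does not emit allocas, so this
pass would have nothing to do yet), the "mem2reg" pass could be appended to the
same pipeline like so:</p>

<div class="doc_code">
<pre>
  // Promote allocas to SSA registers ("mem2reg").  Shown only to illustrate
  // how extra passes are appended; it is not needed by this chapter's code.
  OurFPM.add(createPromoteMemoryToRegisterPass());
</pre>
</div>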

<p>Once the pass manager is set up, we need to make use of it.  We do this by
running it after our newly created function is constructed (in
<tt>FunctionAST::Codegen</tt>), but before it is returned to the client:</p>

<div class="doc_code">
<pre>
  if (Value *RetVal = Body-&gt;Codegen()) {
    // Finish off the function.
    Builder.CreateRet(RetVal);

    // Validate the generated code, checking for consistency.
    verifyFunction(*TheFunction);

    // Optimize the function.
    TheFPM-&gt;run(*TheFunction);

    return TheFunction;
  }
</pre>
</div>

<p>As you can see, this is pretty straightforward.  The
<tt>FunctionPassManager</tt> optimizes and updates the LLVM Function* in place,
improving (hopefully) its body.  With this in place, we can try our test above
again:</p>

<div class="doc_code">
<pre>
ready&gt; <b>def test(x) (1+2+x)*(x+(1+2));</b>
ready&gt; Read function definition:
define double @test(double %x) {
entry:
        %addtmp = add double %x, 3.000000e+00
        %multmp = mul double %addtmp, %addtmp
        ret double %multmp
}
</pre>
</div>

<p>As expected, we now get our nicely optimized code, saving a floating point
add instruction from the program.</p>

<p>LLVM provides a wide variety of optimizations that can be used in certain
circumstances.  Some <a href="../Passes.html">documentation about the various
passes</a> is available, but it isn't very complete.  Another good source of
ideas is to look at the passes that <tt>llvm-gcc</tt> or
<tt>llvm-ld</tt> run to get started.  The "<tt>opt</tt>" tool allows you to
experiment with passes from the command line, so you can see if they do
anything.</p>
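
<p>For example, assuming you have a module in a file <tt>t.ll</tt>, a pipeline
like the following lets you try out the same four passes we use here and inspect
the resulting IR (pass names may vary slightly between LLVM releases):</p>

<div class="doc_code">
<pre>
  # Sketch: run the same cleanup passes from the command line.
  llvm-as &lt; t.ll | opt -instcombine -reassociate -gvn -simplifycfg | llvm-dis
</pre>
</div>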

<p>Now that we have reasonable code coming out of our front-end, let's talk about
executing it!</p>

</div>

<!-- *********************************************************************** -->
<div class="doc_section"><a name="jit">Adding a JIT Compiler</a></div>
<!-- *********************************************************************** -->

<div class="doc_text">

<p>Once the code is available in LLVM IR form, a wide variety of tools can be
applied to it.  For example, you can run optimizations on it (as we did above),
you can dump it out in textual or binary forms, you can compile the code to an
assembly file (.s) for some target, or you can JIT compile it.  The nice thing
about the LLVM IR representation is that it is the common currency between many
different parts of the compiler.
</p>

<p>In this chapter, we'll add JIT compiler support to our interpreter.  The
basic idea that we want for Kaleidoscope is to have the user enter function
bodies as they do now, but immediately evaluate the top-level expressions they
type in.  For example, if they type in "1 + 2;", we should evaluate and print
out 3.  If they define a function, they should be able to call it from the
command line.</p>

<p>In order to do this, we first declare and initialize the JIT.  This is done
by adding a global variable and a call in <tt>main</tt>:</p>

<div class="doc_code">
<pre>
static ExecutionEngine *TheExecutionEngine;
...
int main() {
  ..
  // Create the JIT.
  TheExecutionEngine = ExecutionEngine::create(TheModule);
  ..
}
</pre>
</div>

<p>This creates an abstract "Execution Engine" which can be either a JIT
compiler or the LLVM interpreter.  LLVM will automatically pick a JIT compiler
for you if one is available for your platform, otherwise it will fall back to
the interpreter.</p>
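
<p>If you ever want to pick explicitly, <tt>ExecutionEngine::create</tt> also has
a form that takes a <tt>ModuleProvider</tt> and a "force interpreter" flag.  The
exact signature may differ between releases, so treat this as a sketch rather
than a drop-in replacement for the line above:</p>

<div class="doc_code">
<pre>
  // Sketch: explicitly request the interpreter instead of the JIT.
  ExistingModuleProvider *MP = new ExistingModuleProvider(TheModule);
  TheExecutionEngine = ExecutionEngine::create(MP, /*ForceInterpreter=*/true);
</pre>
</div>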

<p>Once the <tt>ExecutionEngine</tt> is created, the JIT is ready to be used.
There are a variety of APIs that are useful, but the simplest one is the
"<tt>getPointerToFunction(F)</tt>" method.  This method JIT compiles the
specified LLVM Function and returns a function pointer to the generated machine
code.  In our case, this means that we can change the code that parses a
top-level expression to look like this:</p>

<div class="doc_code">
<pre>
static void HandleTopLevelExpression() {
  // Evaluate a top level expression into an anonymous function.
  if (FunctionAST *F = ParseTopLevelExpr()) {
    if (Function *LF = F-&gt;Codegen()) {
      LF-&gt;dump();  // Dump the function for exposition purposes.

      // JIT the function, returning a function pointer.
      void *FPtr = TheExecutionEngine-&gt;getPointerToFunction(LF);

      // Cast it to the right type (takes no arguments, returns a double) so we
      // can call it as a native function.
      double (*FP)() = (double (*)())FPtr;
      fprintf(stderr, "Evaluated to %f\n", FP());
    }
</pre>
</div>

<p>Recall that we compile top-level expressions into a self-contained LLVM
function that takes no arguments and returns the computed double.  Because the
LLVM JIT compiler matches the native platform ABI, this means that you can just
cast the result pointer to a function pointer of that type and call it directly.
As such, there is no difference between JIT compiled code and native machine
code that is statically linked into your application.</p>
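
<p>Nothing about this trick is specific to the no-argument case; the cast just
has to match the prototype of whatever you JIT.  For instance, calling a
hypothetical one-argument function <tt>SomeFunction</tt> (an illustration, not a
change to our code) would look like:</p>

<div class="doc_code">
<pre>
  // Sketch: calling a JIT'd function that takes one double argument.
  void *FPtr = TheExecutionEngine-&gt;getPointerToFunction(SomeFunction);
  double (*FP)(double) = (double (*)(double))FPtr;
  fprintf(stderr, "result: %f\n", FP(4.0));
</pre>
</div>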

<p>With just these two changes, let's see how Kaleidoscope works now!</p>

<div class="doc_code">
<pre>
ready&gt; <b>4+5;</b>
define double @""() {
entry:
        ret double 9.000000e+00
}

<em>Evaluated to 9.000000</em>
</pre>
</div>

<p>Well, this looks like it is basically working.  The dump of the function
shows the "no argument function that always returns double" that we synthesize
for each top level expression that is typed in.  This demonstrates very basic
functionality, but can we do more?</p>

<div class="doc_code">
<pre>
ready&gt; <b>def testfunc(x y) x + y*2; </b>
Read function definition:
define double @testfunc(double %x, double %y) {
entry:
        %multmp = mul double %y, 2.000000e+00
        %addtmp = add double %multmp, %x
        ret double %addtmp
}

ready&gt; <b>testfunc(4, 10);</b>
define double @""() {
entry:
        %calltmp = call double @testfunc( double 4.000000e+00, double 1.000000e+01 )
        ret double %calltmp
}

<em>Evaluated to 24.000000</em>
</pre>
</div>

<p>This illustrates that we can now call user code, but it is a bit subtle what
is going on here.  Note that we only invoke the JIT on the anonymous function
that <em>calls testfunc</em>, but we never invoked it on <em>testfunc
itself</em>.</p>

<p>What actually happened here is that the anonymous function is
JIT'd when requested.  When the Kaleidoscope app calls through the function
pointer that is returned, the anonymous function starts executing.  It ends up
making the call to the "testfunc" function, and ends up in a stub that invokes
the JIT, lazily, on testfunc.  Once the JIT finishes lazily compiling testfunc,
it returns and the code re-executes the call.</p>

<p>In summary, the JIT will lazily JIT code on the fly as it is needed.  The
JIT provides a number of other more advanced interfaces for things like freeing
allocated machine code, re-JIT'ing functions to update them, etc.  However, even
with this simple code, we get some surprisingly powerful capabilities - check
this out (I removed the dump of the anonymous functions, you should get the idea
by now :) :</p>

<div class="doc_code">
<pre>
ready&gt; <b>extern sin(x);</b>
Read extern:
declare double @sin(double)

ready&gt; <b>extern cos(x);</b>
Read extern:
declare double @cos(double)

ready&gt; <b>sin(1.0);</b>
<em>Evaluated to 0.841471</em>

ready&gt; <b>def foo(x) sin(x)*sin(x) + cos(x)*cos(x);</b>
Read function definition:
define double @foo(double %x) {
entry:
        %calltmp = call double @sin( double %x )
        %multmp = mul double %calltmp, %calltmp
        %calltmp2 = call double @cos( double %x )
        %multmp4 = mul double %calltmp2, %calltmp2
        %addtmp = add double %multmp, %multmp4
        ret double %addtmp
}

ready&gt; <b>foo(4.0);</b>
<em>Evaluated to 1.000000</em>
</pre>
</div>

<p>Whoa, how does the JIT know about sin and cos?  The answer is simple: in this
example, the JIT started execution of a function and got to a function call.  It
realized that the function was not yet JIT compiled and invoked the standard set
of routines to resolve the function.  In this case, there is no body defined
for the function, so the JIT ended up calling "<tt>dlsym("sin")</tt>" on itself.
Since "<tt>sin</tt>" is defined within the JIT's address space, it simply
patches up calls in the module to call the libm version of <tt>sin</tt>
directly.</p>

<p>The LLVM JIT provides a number of interfaces (look in the
<tt>ExecutionEngine.h</tt> file) for controlling how unknown functions get
resolved.  It allows you to establish explicit mappings between IR objects and
addresses (useful for LLVM global variables that you want to map to static
tables, for example), allows you to dynamically decide on the fly based on the
function name, and even allows you to have the JIT abort itself if any lazy
compilation is attempted.</p>
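
<p>As a sketch of what those interfaces look like in use (the <tt>mysin</tt>
replacement below is purely illustrative, not something our interpreter needs),
explicit mappings are installed with <tt>addGlobalMapping</tt>, and a by-name
callback can be installed with <tt>InstallLazyFunctionCreator</tt>:</p>

<div class="doc_code">
<pre>
// Illustrative replacement for sin; requires &lt;cmath&gt;.
extern "C" double mysin(double X) { return sin(X); }
...
  // Resolve calls to the IR function "sin" to an address of our choosing.
  if (Function *SinF = TheModule-&gt;getFunction("sin"))
    TheExecutionEngine-&gt;addGlobalMapping(SinF, (void*)mysin);

  // Or decide on the fly, by name, for any unresolved function.
  TheExecutionEngine-&gt;InstallLazyFunctionCreator(MyLazyResolver);
</pre>
</div>

<p>Here <tt>MyLazyResolver</tt> would be a function of your own with the
signature <tt>void *MyLazyResolver(const std::string &amp;Name)</tt>, returning
the address to use for the named function.</p>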

<p>One interesting application of this is that we can now extend the language
by writing arbitrary C++ code to implement operations.  For example, if we add:
</p>

<div class="doc_code">
<pre>
/// putchard - putchar that takes a double and returns 0.
extern "C"
double putchard(double X) {
  putchar((char)X);
  return 0;
}
</pre>
</div>

<p>Now we can produce simple output to the console by using things like:
"<tt>extern putchard(x); putchard(120);</tt>", which prints a lowercase 'x' on
the console (120 is the ASCII code for 'x').  Similar code could be used to
implement file I/O, console input, and many other capabilities in
Kaleidoscope.</p>
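
<p>In the same spirit, a "printd" helper is an easy way to print full numeric
values; this one is just a suggestion and is not part of this chapter's
listing:</p>

<div class="doc_code">
<pre>
/// printd - printf that takes a double and prints it as "%f\n", returning 0.
extern "C"
double printd(double X) {
  printf("%f\n", X);
  return 0;
}
</pre>
</div>

<p>With it, "<tt>extern printd(x); printd(123.456);</tt>" prints the value itself
rather than interpreting it as a character code.</p>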

<p>This completes the JIT and optimizer chapter of the Kaleidoscope tutorial.  At
this point, we can compile a non-Turing-complete programming language, optimize
and JIT compile it in a user-driven way.  Next up we'll look into <a
href="LangImpl5.html">extending the language with control flow constructs</a>,
tackling some interesting LLVM IR issues along the way.</p>

</div>

<!-- *********************************************************************** -->
<div class="doc_section"><a name="code">Full Code Listing</a></div>
<!-- *********************************************************************** -->

<div class="doc_text">

<p>
Here is the complete code listing for our running example, enhanced with the
LLVM JIT and optimizer.  To build this example, use:
</p>

<div class="doc_code">
<pre>
  # Compile
  g++ -g toy.cpp `llvm-config --cppflags --ldflags --libs core jit native` -O3 -o toy
  # Run
  ./toy
</pre>
</div>

<p>Here is the code:</p>

<div class="doc_code">
<pre>
#include "llvm/DerivedTypes.h"
#include "llvm/ExecutionEngine/ExecutionEngine.h"
#include "llvm/Module.h"
#include "llvm/ModuleProvider.h"
#include "llvm/PassManager.h"
#include "llvm/Analysis/Verifier.h"
#include "llvm/Target/TargetData.h"
#include "llvm/Transforms/Scalar.h"
#include "llvm/Support/LLVMBuilder.h"
#include &lt;cctype&gt;
#include &lt;cstdio&gt;
#include &lt;cstdlib&gt;
#include &lt;string&gt;
#include &lt;map&gt;
#include &lt;vector&gt;
using namespace llvm;

//===----------------------------------------------------------------------===//
// Lexer
//===----------------------------------------------------------------------===//

// The lexer returns tokens [0-255] if it is an unknown character, otherwise one
// of these for known things.
enum Token {
  tok_eof = -1,

  // commands
  tok_def = -2, tok_extern = -3,

  // primary
  tok_identifier = -4, tok_number = -5
};

static std::string IdentifierStr;  // Filled in if tok_identifier
static double NumVal;              // Filled in if tok_number

/// gettok - Return the next token from standard input.
static int gettok() {
  static int LastChar = ' ';

  // Skip any whitespace.
  while (isspace(LastChar))
    LastChar = getchar();

  if (isalpha(LastChar)) { // identifier: [a-zA-Z][a-zA-Z0-9]*
    IdentifierStr = LastChar;
    while (isalnum((LastChar = getchar())))
      IdentifierStr += LastChar;

    if (IdentifierStr == "def") return tok_def;
    if (IdentifierStr == "extern") return tok_extern;
    return tok_identifier;
  }

  if (isdigit(LastChar) || LastChar == '.') {   // Number: [0-9.]+
    std::string NumStr;
    do {
      NumStr += LastChar;
      LastChar = getchar();
    } while (isdigit(LastChar) || LastChar == '.');

    NumVal = strtod(NumStr.c_str(), 0);
    return tok_number;
  }

  if (LastChar == '#') {
    // Comment until end of line.
    do LastChar = getchar();
    while (LastChar != EOF &amp;&amp; LastChar != '\n' &amp;&amp; LastChar != '\r');

    if (LastChar != EOF)
      return gettok();
  }

  // Check for end of file.  Don't eat the EOF.
  if (LastChar == EOF)
    return tok_eof;

  // Otherwise, just return the character as its ASCII value.
  int ThisChar = LastChar;
  LastChar = getchar();
  return ThisChar;
}

//===----------------------------------------------------------------------===//
// Abstract Syntax Tree (aka Parse Tree)
//===----------------------------------------------------------------------===//

/// ExprAST - Base class for all expression nodes.
class ExprAST {
public:
  virtual ~ExprAST() {}
  virtual Value *Codegen() = 0;
};

/// NumberExprAST - Expression class for numeric literals like "1.0".
class NumberExprAST : public ExprAST {
  double Val;
public:
  NumberExprAST(double val) : Val(val) {}
  virtual Value *Codegen();
};

/// VariableExprAST - Expression class for referencing a variable, like "a".
class VariableExprAST : public ExprAST {
  std::string Name;
public:
  VariableExprAST(const std::string &amp;name) : Name(name) {}
  virtual Value *Codegen();
};

/// BinaryExprAST - Expression class for a binary operator.
class BinaryExprAST : public ExprAST {
  char Op;
  ExprAST *LHS, *RHS;
public:
  BinaryExprAST(char op, ExprAST *lhs, ExprAST *rhs)
    : Op(op), LHS(lhs), RHS(rhs) {}
  virtual Value *Codegen();
};

/// CallExprAST - Expression class for function calls.
class CallExprAST : public ExprAST {
  std::string Callee;
  std::vector&lt;ExprAST*&gt; Args;
public:
  CallExprAST(const std::string &amp;callee, std::vector&lt;ExprAST*&gt; &amp;args)
    : Callee(callee), Args(args) {}
  virtual Value *Codegen();
};

/// PrototypeAST - This class represents the "prototype" for a function,
/// which captures its argument names as well as if it is an operator.
class PrototypeAST {
  std::string Name;
  std::vector&lt;std::string&gt; Args;
public:
  PrototypeAST(const std::string &amp;name, const std::vector&lt;std::string&gt; &amp;args)
    : Name(name), Args(args) {}

  Function *Codegen();
};

/// FunctionAST - This class represents a function definition itself.
class FunctionAST {
  PrototypeAST *Proto;
  ExprAST *Body;
public:
  FunctionAST(PrototypeAST *proto, ExprAST *body)
    : Proto(proto), Body(body) {}

  Function *Codegen();
};

//===----------------------------------------------------------------------===//
// Parser
//===----------------------------------------------------------------------===//

/// CurTok/getNextToken - Provide a simple token buffer.  CurTok is the current
/// token the parser is looking at.  getNextToken reads another token from the
/// lexer and updates CurTok with its results.
static int CurTok;
static int getNextToken() {
  return CurTok = gettok();
}

/// BinopPrecedence - This holds the precedence for each binary operator that is
/// defined.
static std::map&lt;char, int&gt; BinopPrecedence;

/// GetTokPrecedence - Get the precedence of the pending binary operator token.
static int GetTokPrecedence() {
  if (!isascii(CurTok))
    return -1;

  // Make sure it's a declared binop.
  int TokPrec = BinopPrecedence[CurTok];
  if (TokPrec &lt;= 0) return -1;
  return TokPrec;
}

/// Error* - These are little helper functions for error handling.
ExprAST *Error(const char *Str) { fprintf(stderr, "Error: %s\n", Str); return 0; }
PrototypeAST *ErrorP(const char *Str) { Error(Str); return 0; }
FunctionAST *ErrorF(const char *Str) { Error(Str); return 0; }

static ExprAST *ParseExpression();

/// identifierexpr
///   ::= identifier
///   ::= identifier '(' expression* ')'
static ExprAST *ParseIdentifierExpr() {
  std::string IdName = IdentifierStr;

  getNextToken();  // eat identifier.

  if (CurTok != '(') // Simple variable ref.
    return new VariableExprAST(IdName);

  // Call.
  getNextToken();  // eat (
  std::vector&lt;ExprAST*&gt; Args;
  while (1) {
    ExprAST *Arg = ParseExpression();
    if (!Arg) return 0;
    Args.push_back(Arg);

    if (CurTok == ')') break;

    if (CurTok != ',')
      return Error("Expected ')'");
    getNextToken();
  }

  // Eat the ')'.
  getNextToken();

  return new CallExprAST(IdName, Args);
}

/// numberexpr ::= number
static ExprAST *ParseNumberExpr() {
  ExprAST *Result = new NumberExprAST(NumVal);
  getNextToken(); // consume the number
  return Result;
}

/// parenexpr ::= '(' expression ')'
static ExprAST *ParseParenExpr() {
  getNextToken();  // eat (.
  ExprAST *V = ParseExpression();
  if (!V) return 0;

  if (CurTok != ')')
    return Error("expected ')'");
  getNextToken();  // eat ).
  return V;
}

/// primary
///   ::= identifierexpr
///   ::= numberexpr
///   ::= parenexpr
static ExprAST *ParsePrimary() {
  switch (CurTok) {
  default: return Error("unknown token when expecting an expression");
  case tok_identifier: return ParseIdentifierExpr();
  case tok_number:     return ParseNumberExpr();
  case '(':            return ParseParenExpr();
  }
}

/// binoprhs
///   ::= ('+' primary)*
static ExprAST *ParseBinOpRHS(int ExprPrec, ExprAST *LHS) {
  // If this is a binop, find its precedence.
  while (1) {
    int TokPrec = GetTokPrecedence();

    // If this is a binop that binds at least as tightly as the current binop,
    // consume it, otherwise we are done.
    if (TokPrec &lt; ExprPrec)
      return LHS;

    // Okay, we know this is a binop.
    int BinOp = CurTok;
    getNextToken();  // eat binop

    // Parse the primary expression after the binary operator.
    ExprAST *RHS = ParsePrimary();
    if (!RHS) return 0;

    // If BinOp binds less tightly with RHS than the operator after RHS, let
    // the pending operator take RHS as its LHS.
    int NextPrec = GetTokPrecedence();
    if (TokPrec &lt; NextPrec) {
      RHS = ParseBinOpRHS(TokPrec+1, RHS);
      if (RHS == 0) return 0;
    }

    // Merge LHS/RHS.
    LHS = new BinaryExprAST(BinOp, LHS, RHS);
  }
}

/// expression
///   ::= primary binoprhs
///
static ExprAST *ParseExpression() {
  ExprAST *LHS = ParsePrimary();
  if (!LHS) return 0;

  return ParseBinOpRHS(0, LHS);
}

/// prototype
///   ::= id '(' id* ')'
static PrototypeAST *ParsePrototype() {
  if (CurTok != tok_identifier)
    return ErrorP("Expected function name in prototype");

  std::string FnName = IdentifierStr;
  getNextToken();

  if (CurTok != '(')
    return ErrorP("Expected '(' in prototype");

  std::vector&lt;std::string&gt; ArgNames;
  while (getNextToken() == tok_identifier)
    ArgNames.push_back(IdentifierStr);
  if (CurTok != ')')
    return ErrorP("Expected ')' in prototype");

  // success.
  getNextToken();  // eat ')'.

  return new PrototypeAST(FnName, ArgNames);
}

/// definition ::= 'def' prototype expression
static FunctionAST *ParseDefinition() {
  getNextToken();  // eat def.
  PrototypeAST *Proto = ParsePrototype();
  if (Proto == 0) return 0;

  if (ExprAST *E = ParseExpression())
    return new FunctionAST(Proto, E);
  return 0;
}

/// toplevelexpr ::= expression
static FunctionAST *ParseTopLevelExpr() {
  if (ExprAST *E = ParseExpression()) {
    // Make an anonymous proto.
    PrototypeAST *Proto = new PrototypeAST("", std::vector&lt;std::string&gt;());
    return new FunctionAST(Proto, E);
  }
  return 0;
}

/// external ::= 'extern' prototype
static PrototypeAST *ParseExtern() {
  getNextToken();  // eat extern.
  return ParsePrototype();
}

//===----------------------------------------------------------------------===//
// Code Generation
//===----------------------------------------------------------------------===//

static Module *TheModule;
static LLVMFoldingBuilder Builder;
static std::map&lt;std::string, Value*&gt; NamedValues;
static FunctionPassManager *TheFPM;

Value *ErrorV(const char *Str) { Error(Str); return 0; }

Value *NumberExprAST::Codegen() {
  return ConstantFP::get(Type::DoubleTy, APFloat(Val));
}

Value *VariableExprAST::Codegen() {
  // Look this variable up in the function.
  Value *V = NamedValues[Name];
  return V ? V : ErrorV("Unknown variable name");
}

Value *BinaryExprAST::Codegen() {
  Value *L = LHS-&gt;Codegen();
  Value *R = RHS-&gt;Codegen();
  if (L == 0 || R == 0) return 0;

  switch (Op) {
  case '+': return Builder.CreateAdd(L, R, "addtmp");
  case '-': return Builder.CreateSub(L, R, "subtmp");
  case '*': return Builder.CreateMul(L, R, "multmp");
  case '&lt;':
    L = Builder.CreateFCmpULT(L, R, "multmp");
    // Convert bool 0/1 to double 0.0 or 1.0
    return Builder.CreateUIToFP(L, Type::DoubleTy, "booltmp");
  default: return ErrorV("invalid binary operator");
  }
}

Value *CallExprAST::Codegen() {
  // Look up the name in the global module table.
  Function *CalleeF = TheModule-&gt;getFunction(Callee);
  if (CalleeF == 0)
    return ErrorV("Unknown function referenced");

  // If argument mismatch error.
  if (CalleeF-&gt;arg_size() != Args.size())
    return ErrorV("Incorrect # arguments passed");

  std::vector&lt;Value*&gt; ArgsV;
  for (unsigned i = 0, e = Args.size(); i != e; ++i) {
    ArgsV.push_back(Args[i]-&gt;Codegen());
    if (ArgsV.back() == 0) return 0;
  }

  return Builder.CreateCall(CalleeF, ArgsV.begin(), ArgsV.end(), "calltmp");
}

Function *PrototypeAST::Codegen() {
  // Make the function type:  double(double,double) etc.
  std::vector&lt;const Type*&gt; Doubles(Args.size(), Type::DoubleTy);
  FunctionType *FT = FunctionType::get(Type::DoubleTy, Doubles, false);

  Function *F = new Function(FT, Function::ExternalLinkage, Name, TheModule);

  // If F conflicted, there was already something named 'Name'.  If it has a
  // body, don't allow redefinition or reextern.
  if (F-&gt;getName() != Name) {
    // Delete the one we just made and get the existing one.
    F-&gt;eraseFromParent();
    F = TheModule-&gt;getFunction(Name);

    // If F already has a body, reject this.
    if (!F-&gt;empty()) {
      ErrorF("redefinition of function");
      return 0;
    }

    // If F took a different number of args, reject.
    if (F-&gt;arg_size() != Args.size()) {
      ErrorF("redefinition of function with different # args");
      return 0;
    }
  }

  // Set names for all arguments.
  unsigned Idx = 0;
  for (Function::arg_iterator AI = F-&gt;arg_begin(); Idx != Args.size();
       ++AI, ++Idx) {
    AI-&gt;setName(Args[Idx]);

    // Add arguments to variable symbol table.
    NamedValues[Args[Idx]] = AI;
  }

  return F;
}

Function *FunctionAST::Codegen() {
  NamedValues.clear();

  Function *TheFunction = Proto-&gt;Codegen();
  if (TheFunction == 0)
    return 0;

  // Create a new basic block to start insertion into.
  BasicBlock *BB = new BasicBlock("entry", TheFunction);
  Builder.SetInsertPoint(BB);

  if (Value *RetVal = Body-&gt;Codegen()) {
    // Finish off the function.
    Builder.CreateRet(RetVal);

    // Validate the generated code, checking for consistency.
    verifyFunction(*TheFunction);

    // Optimize the function.
    TheFPM-&gt;run(*TheFunction);

    return TheFunction;
  }

  // Error reading body, remove function.
  TheFunction-&gt;eraseFromParent();
  return 0;
}

//===----------------------------------------------------------------------===//
// Top-Level parsing and JIT Driver
//===----------------------------------------------------------------------===//

static ExecutionEngine *TheExecutionEngine;

static void HandleDefinition() {
  if (FunctionAST *F = ParseDefinition()) {
    if (Function *LF = F-&gt;Codegen()) {
      fprintf(stderr, "Read function definition:");
      LF-&gt;dump();
    }
  } else {
    // Skip token for error recovery.
    getNextToken();
  }
}

static void HandleExtern() {
  if (PrototypeAST *P = ParseExtern()) {
    if (Function *F = P-&gt;Codegen()) {
      fprintf(stderr, "Read extern: ");
      F-&gt;dump();
    }
  } else {
    // Skip token for error recovery.
    getNextToken();
  }
}

static void HandleTopLevelExpression() {
  // Evaluate a top level expression into an anonymous function.
  if (FunctionAST *F = ParseTopLevelExpr()) {
    if (Function *LF = F-&gt;Codegen()) {
      // JIT the function, returning a function pointer.
      void *FPtr = TheExecutionEngine-&gt;getPointerToFunction(LF);

      // Cast it to the right type (takes no arguments, returns a double) so we
      // can call it as a native function.
      double (*FP)() = (double (*)())FPtr;
      fprintf(stderr, "Evaluated to %f\n", FP());
    }
  } else {
    // Skip token for error recovery.
    getNextToken();
  }
}

/// top ::= definition | external | expression | ';'
static void MainLoop() {
  while (1) {
    fprintf(stderr, "ready&gt; ");
    switch (CurTok) {
    case tok_eof:    return;
    case ';':        getNextToken(); break;  // ignore top level semicolons.
    case tok_def:    HandleDefinition(); break;
    case tok_extern: HandleExtern(); break;
    default:         HandleTopLevelExpression(); break;
    }
  }
}



//===----------------------------------------------------------------------===//
// "Library" functions that can be "extern'd" from user code.
//===----------------------------------------------------------------------===//

/// putchard - putchar that takes a double and returns 0.
extern "C"
double putchard(double X) {
  putchar((char)X);
  return 0;
}

//===----------------------------------------------------------------------===//
// Main driver code.
//===----------------------------------------------------------------------===//

int main() {
  // Install standard binary operators.
  // 1 is lowest precedence.
  BinopPrecedence['&lt;'] = 10;
  BinopPrecedence['+'] = 20;
  BinopPrecedence['-'] = 20;
  BinopPrecedence['*'] = 40;  // highest.

  // Prime the first token.
  fprintf(stderr, "ready&gt; ");
  getNextToken();

  // Make the module, which holds all the code.
  TheModule = new Module("my cool jit");

  // Create the JIT.
  TheExecutionEngine = ExecutionEngine::create(TheModule);

  {
    ExistingModuleProvider OurModuleProvider(TheModule);
    FunctionPassManager OurFPM(&amp;OurModuleProvider);

    // Set up the optimizer pipeline.  Start with registering info about how the
    // target lays out data structures.
    OurFPM.add(new TargetData(*TheExecutionEngine-&gt;getTargetData()));
    // Do simple "peephole" optimizations and bit-twiddling optzns.
    OurFPM.add(createInstructionCombiningPass());
    // Reassociate expressions.
    OurFPM.add(createReassociatePass());
    // Eliminate Common SubExpressions.
    OurFPM.add(createGVNPass());
    // Simplify the control flow graph (deleting unreachable blocks, etc).
    OurFPM.add(createCFGSimplificationPass());

    // Set the global so the code gen can use this.
    TheFPM = &amp;OurFPM;

    // Run the main "interpreter loop" now.
    MainLoop();

    TheFPM = 0;
  }  // Free module provider and pass manager.


  // Print out all of the generated code.
  TheModule-&gt;dump();
  return 0;
}
</pre>
</div>

</div>

<!-- *********************************************************************** -->
<hr>
<address>
  <a href="http://jigsaw.w3.org/css-validator/check/referer"><img
  src="http://jigsaw.w3.org/css-validator/images/vcss" alt="Valid CSS!"></a>
  <a href="http://validator.w3.org/check/referer"><img
  src="http://www.w3.org/Icons/valid-html401" alt="Valid HTML 4.01!"></a>

  <a href="mailto:sabre@nondot.org">Chris Lattner</a><br>
  <a href="http://llvm.org">The LLVM Compiler Infrastructure</a><br>
  Last modified: $Date: 2007-10-17 11:05:13 -0700 (Wed, 17 Oct 2007) $
</address>
</body>
</html>