=======================================================
Building a JIT: Starting out with KaleidoscopeJIT
=======================================================

.. contents::
   :local:

Chapter 1 Introduction
======================

Welcome to Chapter 1 of the "Building an ORC-based JIT in LLVM" tutorial. This
tutorial runs through the implementation of a JIT compiler using LLVM's
On-Request-Compilation (ORC) APIs. It begins with a simplified version of the
KaleidoscopeJIT class used in the
`Implementing a language with LLVM <LangImpl01.html>`_ tutorials and then
introduces new features like optimization, lazy compilation and remote
execution.

The goal of this tutorial is to introduce you to LLVM's ORC JIT APIs, show how
these APIs interact with other parts of LLVM, and to teach you how to recombine
them to build a custom JIT that is suited to your use-case.

The structure of the tutorial is:

- Chapter #1: Investigate the simple KaleidoscopeJIT class. This will
  introduce some of the basic concepts of the ORC JIT APIs, including the
  idea of an ORC *Layer*.

- `Chapter #2 <BuildingAJIT2.html>`_: Extend the basic KaleidoscopeJIT by adding
  a new layer that will optimize IR and generated code.

- `Chapter #3 <BuildingAJIT3.html>`_: Further extend the JIT by adding a
  Compile-On-Demand layer to lazily compile IR.

- `Chapter #4 <BuildingAJIT4.html>`_: Improve the laziness of our JIT by
  replacing the Compile-On-Demand layer with a custom layer that uses the ORC
  Compile Callbacks API directly to defer IR-generation until functions are
  called.

- `Chapter #5 <BuildingAJIT5.html>`_: Add process isolation by JITing code into
  a remote process with reduced privileges using the JIT Remote APIs.

To provide input for our JIT we will use the Kaleidoscope REPL from
`Chapter 7 <LangImpl07.html>`_ of the "Implementing a language with LLVM" tutorial,
with one minor modification: We will remove the FunctionPassManager from the
code for that chapter and replace it with optimization support in our JIT class
in Chapter #2.

Finally, a word on API generations: ORC is the 3rd generation of LLVM JIT API.
It was preceded by MCJIT, and before that by the (now deleted) legacy JIT.
These tutorials don't assume any experience with these earlier APIs, but
readers acquainted with them will see many familiar elements. Where appropriate
we will make this connection with the earlier APIs explicit to help people who
are transitioning from them to ORC.

JIT API Basics
==============

The purpose of a JIT compiler is to compile code "on-the-fly" as it is needed,
rather than compiling whole programs to disk ahead of time as a traditional
compiler does. To support that aim our initial, bare-bones JIT API will be:

1. Handle addModule(Module &M) -- Make the given IR module available for
   execution.
2. JITSymbol findSymbol(const std::string &Name) -- Search for pointers to
   symbols (functions or variables) that have been added to the JIT.
3. void removeModule(Handle H) -- Remove a module from the JIT, releasing any
   memory that had been used for the compiled code.

A basic use-case for this API, executing the 'main' function from a module,
will look like:

.. code-block:: c++

  std::unique_ptr<Module> M = buildModule();
  JIT J;
  Handle H = J.addModule(*M);
  int (*Main)(int, char*[]) = (int(*)(int, char*[]))J.getSymbolAddress("main");
  int Result = Main(0, nullptr);
  J.removeModule(H);

The APIs that we build in these tutorials will all be variations on this simple
theme. Behind the API we will refine the implementation of the JIT to add
support for optimization and lazy compilation. Eventually we will extend the
API itself to allow higher-level program representations (e.g. ASTs) to be
added to the JIT.

KaleidoscopeJIT
===============

In the previous section we described our API; now we examine a simple
implementation of it: The KaleidoscopeJIT class [1]_ that was used in the
`Implementing a language with LLVM <LangImpl01.html>`_ tutorials. We will use
the REPL code from `Chapter 7 <LangImpl07.html>`_ of that tutorial to supply the
input for our JIT: Each time the user enters an expression the REPL will add a
new IR module containing the code for that expression to the JIT. If the
expression is a top-level expression like '1+1' or 'sin(x)', the REPL will also
use the findSymbol method of our JIT class to find and execute the code for the
expression, and then use the removeModule method to remove the code again
(since there's no way to re-invoke an anonymous expression). In later chapters
of this tutorial we'll modify the REPL to enable new interactions with our JIT
class, but for now we will take this setup for granted and focus our attention on
the implementation of our JIT itself.
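
To make that interaction concrete, here is a sketch of what the REPL's
top-level-expression handler does with the JIT, written against the API
described above. The ``TheJIT`` and ``TheModule`` globals and the
``__anon_expr`` function name are borrowed from the Kaleidoscope tutorials;
the real code appears in Chapter 7 of that series.

.. code-block:: c++

  // Hand the freshly IRGen'd module to the JIT, which compiles it.
  auto H = TheJIT->addModule(std::move(TheModule));

  // Find the compiled code for the anonymous expression and run it.
  // (Kaleidoscope values are doubles, so the expression is a double() function.)
  auto ExprSymbol = TheJIT->findSymbol("__anon_expr");
  double (*FP)() = (double (*)())(intptr_t)cantFail(ExprSymbol.getAddress());
  fprintf(stderr, "Evaluated to %f\n", FP());

  // Anonymous expressions can't be re-invoked, so discard the module.
  TheJIT->removeModule(H);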

Our KaleidoscopeJIT class is defined in the KaleidoscopeJIT.h header. After the
usual include guards and #includes [2]_, we get to the definition of our class:

.. code-block:: c++

  #ifndef LLVM_EXECUTIONENGINE_ORC_KALEIDOSCOPEJIT_H
  #define LLVM_EXECUTIONENGINE_ORC_KALEIDOSCOPEJIT_H

  #include "llvm/ADT/STLExtras.h"
  #include "llvm/ExecutionEngine/ExecutionEngine.h"
  #include "llvm/ExecutionEngine/RTDyldMemoryManager.h"
  #include "llvm/ExecutionEngine/SectionMemoryManager.h"
  #include "llvm/ExecutionEngine/Orc/CompileUtils.h"
  #include "llvm/ExecutionEngine/Orc/IRCompileLayer.h"
  #include "llvm/ExecutionEngine/Orc/LambdaResolver.h"
  #include "llvm/ExecutionEngine/Orc/RTDyldObjectLinkingLayer.h"
  #include "llvm/IR/Mangler.h"
  #include "llvm/Support/DynamicLibrary.h"
  #include "llvm/Support/raw_ostream.h"
  #include "llvm/Target/TargetMachine.h"
  #include <algorithm>
  #include <memory>
  #include <string>
  #include <vector>

  namespace llvm {
  namespace orc {

  class KaleidoscopeJIT {
  private:
    std::unique_ptr<TargetMachine> TM;
    const DataLayout DL;
    RTDyldObjectLinkingLayer ObjectLayer;
    IRCompileLayer<decltype(ObjectLayer), SimpleCompiler> CompileLayer;

  public:
    using ModuleHandle = decltype(CompileLayer)::ModuleHandleT;

Our class begins with four members: a TargetMachine, TM, which will be used to
build our LLVM compiler instance; a DataLayout, DL, which will be used for
symbol mangling (more on that later); and two ORC *layers*: an
RTDyldObjectLinkingLayer and a CompileLayer. We'll be talking more about layers
in the next chapter, but for now you can think of them as analogous to LLVM
Passes: they wrap up useful JIT utilities behind an easy-to-compose interface.
The first layer, ObjectLayer, is the foundation of our JIT: it takes in-memory
object files produced by a compiler and links them on the fly to make them
executable. This JIT-on-top-of-a-linker design was introduced in MCJIT; however,
the linker was hidden inside the MCJIT class. In ORC we expose the linker so
that clients can access and configure it directly if they need to. In this
tutorial our ObjectLayer will just be used to support the next layer in our
stack: the CompileLayer, which will be responsible for taking LLVM IR, compiling
it, and passing the resulting in-memory object files down to the object linking
layer below.

That's it for member variables; after that we have a single typedef:
ModuleHandle. This is the handle type that will be returned from our JIT's
addModule method, and can be passed to the removeModule method to remove a
module. The IRCompileLayer class already provides a convenient handle type
(IRCompileLayer::ModuleHandleT), so we just alias our ModuleHandle to this.

.. code-block:: c++

  KaleidoscopeJIT()
      : TM(EngineBuilder().selectTarget()), DL(TM->createDataLayout()),
        ObjectLayer([]() { return std::make_shared<SectionMemoryManager>(); }),
        CompileLayer(ObjectLayer, SimpleCompiler(*TM)) {
    llvm::sys::DynamicLibrary::LoadLibraryPermanently(nullptr);
  }

  TargetMachine &getTargetMachine() { return *TM; }
Lang Hames7331cc32016-05-23 20:34:19 +0000174
175Next up we have our class constructor. We begin by initializing TM using the
Lang Hamese815bf32017-08-15 19:20:10 +0000176EngineBuilder::selectTarget helper method which constructs a TargetMachine for
177the current process. Then we use our newly created TargetMachine to initialize
178DL, our DataLayout. After that we need to initialize our ObjectLayer. The
179ObjectLayer requires a function object that will build a JIT memory manager for
180each module that is added (a JIT memory manager manages memory allocations,
181memory permissions, and registration of exception handlers for JIT'd code). For
182this we use a lambda that returns a SectionMemoryManager, an off-the-shelf
183utility that provides all the basic memory management functionality required for
184this chapter. Next we initialize our CompileLayer. The Compile laye needs two
185things: (1) A reference to our object layer, and (2) a compiler instance to use
186to perform the actual compilation from IR to object files. We use the
187off-the-shelf SimpleCompiler instance for now. Finally, in the body of the
188constructor, we call the DynamicLibrary::LoadLibraryPermanently method with a
189nullptr argument. Normally the LoadLibraryPermanently method is called with the
190path of a dynamic library to load, but when passed a null pointer it will 'load'
191the host process itself, making its exported symbols available for execution.
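
The getTargetMachine accessor exists for the REPL's benefit: modules handed to
the JIT should carry the same data layout the JIT compiles for, otherwise
symbol mangling and ABI details won't line up. A minimal sketch of the expected
front-end usage, with ``TheModule`` and ``TheJIT`` standing in for the REPL's
globals:

.. code-block:: c++

  // When the front end creates a fresh module for the next expression, stamp
  // it with the JIT's data layout before generating any code into it.
  TheModule->setDataLayout(TheJIT->getTargetMachine().createDataLayout());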

.. code-block:: c++

  ModuleHandle addModule(std::unique_ptr<Module> M) {
    // Build our symbol resolver:
    // Lambda 1: Look back into the JIT itself to find symbols that are part of
    //           the same "logical dylib".
    // Lambda 2: Search for external symbols in the host process.
    auto Resolver = createLambdaResolver(
        [&](const std::string &Name) {
          if (auto Sym = CompileLayer.findSymbol(Name, false))
            return Sym;
          return JITSymbol(nullptr);
        },
        [](const std::string &Name) {
          if (auto SymAddr =
                RTDyldMemoryManager::getSymbolAddressInProcess(Name))
            return JITSymbol(SymAddr, JITSymbolFlags::Exported);
          return JITSymbol(nullptr);
        });

    // Add the module to the JIT with the resolver we created above.
    return cantFail(CompileLayer.addModule(std::move(M),
                                           std::move(Resolver)));
  }

Now we come to the first of our JIT API methods: addModule. This method is
responsible for adding IR to the JIT and making it available for execution. In
this initial implementation of our JIT we will make our modules "available for
execution" by adding them straight to the CompileLayer, which will immediately
compile them. In later chapters we will teach our JIT to defer compilation
of individual functions until they're actually called.

To add our module to the CompileLayer we need to supply both the module and a
symbol resolver. The symbol resolver is responsible for supplying the JIT with
an address for each *external symbol* in the module we are adding. External
symbols are any symbol not defined within the module itself, including calls to
functions outside the JIT and calls to functions defined in other modules that
have already been added to the JIT. (It may seem as though modules added to the
JIT should know about one another by default, but since we would still have to
supply a symbol resolver for references to code outside the JIT it turns out to
be easier to re-use this one mechanism for all symbol resolution.) This has the
added benefit that the user has full control over the symbol resolution
process. Should we search for definitions within the JIT first, then fall back
on external definitions? Or should we prefer external definitions where
available and only JIT code if we don't already have an available
implementation? By using a single symbol resolution scheme we are free to choose
whatever makes the most sense for any given use case.

Building a symbol resolver is made especially easy by the *createLambdaResolver*
function. This function takes two lambdas [3]_ and returns a JITSymbolResolver
instance. The first lambda is used as the implementation of the resolver's
findSymbolInLogicalDylib method, which searches for symbol definitions that
should be thought of as being part of the same "logical" dynamic library as this
Module. If you are familiar with static linking: this means that
findSymbolInLogicalDylib should expose symbols with common linkage and hidden
visibility. If all this sounds foreign you can ignore the details and just
remember that this is the first method that the linker will use to try to find a
symbol definition. If the findSymbolInLogicalDylib method returns a null result
then the linker will call the second symbol resolver method, called findSymbol,
which searches for symbols that should be thought of as external to (but
visible from) the module and its logical dylib. In this tutorial we will adopt
the following simple scheme: All modules added to the JIT will behave as if they
were linked into a single, ever-growing logical dylib. To implement this our
first lambda (the one defining findSymbolInLogicalDylib) will just search for
JIT'd code by calling the CompileLayer's findSymbol method. If we don't find a
symbol in the JIT itself we'll fall back to our second lambda, which implements
findSymbol. This will use the RTDyldMemoryManager::getSymbolAddressInProcess
method to search for the symbol within the program itself. If we can't find a
symbol definition via either of these paths, the JIT will refuse to accept our
module, returning a "symbol not found" error.
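
To make the earlier point about resolution policy concrete, here is a
hypothetical resolver with the opposite preference: it consults the host
process first and only falls back to JIT'd definitions if the process has
none. It is not part of KaleidoscopeJIT; it is only a sketch showing that the
policy lives entirely in the two lambdas handed to createLambdaResolver.

.. code-block:: c++

  auto PreferProcessResolver = createLambdaResolver(
      // Logical-dylib lookup: deliberately finds nothing in this variant.
      [](const std::string &Name) { return JITSymbol(nullptr); },
      // External lookup: try the host process first, then JIT'd code.
      [&](const std::string &Name) {
        if (auto SymAddr =
              RTDyldMemoryManager::getSymbolAddressInProcess(Name))
          return JITSymbol(SymAddr, JITSymbolFlags::Exported);
        if (auto Sym = CompileLayer.findSymbol(Name, false))
          return Sym;
        return JITSymbol(nullptr);
      });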

Now that we've built our symbol resolver, we're ready to add our module to the
JIT. We do this by calling the CompileLayer's addModule method. The addModule
method returns an ``Expected<CompileLayer::ModuleHandleT>``, since in more
advanced JIT configurations it could fail. In our basic configuration we know
that it will always succeed so we use the cantFail utility to assert that no
error occurred, and extract the handle value. Since we have already typedef'd
our ModuleHandle type to be the same as the CompileLayer's handle type, we can
return the unwrapped handle directly.

.. code-block:: c++

  JITSymbol findSymbol(const std::string Name) {
    std::string MangledName;
    raw_string_ostream MangledNameStream(MangledName);
    Mangler::getNameWithPrefix(MangledNameStream, Name, DL);
    return CompileLayer.findSymbol(MangledNameStream.str(), true);
  }

  JITTargetAddress getSymbolAddress(const std::string Name) {
    return cantFail(findSymbol(Name).getAddress());
  }

  void removeModule(ModuleHandle H) {
    cantFail(CompileLayer.removeModule(H));
  }

Now that we can add code to our JIT, we need a way to find the symbols we've
added to it. To do that we call the findSymbol method on our CompileLayer, but
with a twist: We have to *mangle* the name of the symbol we're searching for
first. The ORC JIT components use mangled symbols internally the same way a
static compiler and linker would, rather than using plain IR symbol names. This
allows JIT'd code to interoperate easily with precompiled code in the
application or shared libraries. The kind of mangling will depend on the
DataLayout, which in turn depends on the target platform. To allow us to remain
portable and search based on the un-mangled name, we just reproduce this
mangling ourselves.
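
The effect of this mangling is platform dependent: Mach-O targets, for example,
specify a leading '_' for global symbols in their DataLayout, so a search for
"main" really searches for "_main", while typical ELF targets add no prefix. As
a rough illustration, here is the same mangling logic factored into a
free-standing helper of our own (not part of KaleidoscopeJIT):

.. code-block:: c++

  // Illustrative only: reproduce the JIT's name mangling for an IR-level name.
  std::string mangle(const std::string &Name, const DataLayout &DL) {
    std::string MangledName;
    raw_string_ostream MangledNameStream(MangledName);
    Mangler::getNameWithPrefix(MangledNameStream, Name, DL);
    return MangledNameStream.str();
  }

  // mangle("main", DL) == "_main" on Mach-O, and "main" on most ELF targets.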

Next we have a convenience function, getSymbolAddress, which returns the address
of a given symbol. Like CompileLayer's addModule function, JITSymbol's getAddress
function is allowed to fail [4]_; however, we know that it will not in our simple
example, so we wrap it in a call to cantFail.

We now come to the last method in our JIT API: removeModule. This method is
responsible for destructing the MemoryManager and SymbolResolver that were
added with a given module, freeing any resources they were using in the
process. In our Kaleidoscope demo we rely on this method to remove the module
representing the most recent top-level expression, preventing it from being
treated as a duplicate definition when the next top-level expression is
entered. It is generally good to free any module that you know you won't need
to call further, just to free up the resources dedicated to it. However, you
don't strictly need to do this: All resources will be cleaned up when your
JIT class is destructed, if they haven't been freed before then. Like
``CompileLayer::addModule`` and ``JITSymbol::getAddress``, removeModule may
fail in general but will never fail in our example, so we wrap it in a call to
cantFail.

This brings us to the end of Chapter 1 of Building a JIT. You now have a basic
but fully functioning JIT stack that you can use to take LLVM IR and make it
executable within the context of your JIT process. In the next chapter we'll
look at how to extend this JIT to produce better quality code, and in the
process take a deeper look at the ORC layer concept.

`Next: Extending the KaleidoscopeJIT <BuildingAJIT2.html>`_

Full Code Listing
=================

Here is the complete code listing for our running example. To build this
example, use:

.. code-block:: bash

    # Compile
    clang++ -g toy.cpp `llvm-config --cxxflags --ldflags --system-libs --libs core orc native` -O3 -o toy
    # Run
    ./toy

Here is the code:

.. literalinclude:: ../../examples/Kaleidoscope/BuildingAJIT/Chapter1/KaleidoscopeJIT.h
    :language: c++

.. [1] Actually we use a cut-down version of KaleidoscopeJIT that makes a
       simplifying assumption: symbols cannot be re-defined. This will make it
       impossible to re-define symbols in the REPL, but will make our symbol
       lookup logic simpler. Re-introducing support for symbol redefinition is
       left as an exercise for the reader. (The KaleidoscopeJIT.h used in the
       original tutorials will be a helpful reference).

.. [2] +----------------------------+------------------------------------------------+
       | File                       | Reason for inclusion                           |
       +============================+================================================+
       | STLExtras.h                | LLVM utilities that are useful when working    |
       |                            | with the STL.                                  |
       +----------------------------+------------------------------------------------+
       | ExecutionEngine.h          | Access to the EngineBuilder::selectTarget      |
       |                            | method.                                        |
       +----------------------------+------------------------------------------------+
       |                            | Access to the                                  |
       | RTDyldMemoryManager.h      | RTDyldMemoryManager::getSymbolAddressInProcess |
       |                            | method.                                        |
       +----------------------------+------------------------------------------------+
       |                            | Provides the SectionMemoryManager class, an    |
       | SectionMemoryManager.h     | off-the-shelf JIT memory manager.              |
       +----------------------------+------------------------------------------------+
       | CompileUtils.h             | Provides the SimpleCompiler class.             |
       +----------------------------+------------------------------------------------+
       | IRCompileLayer.h           | Provides the IRCompileLayer class.             |
       +----------------------------+------------------------------------------------+
       |                            | Access the createLambdaResolver function,      |
       | LambdaResolver.h           | which provides easy construction of symbol     |
       |                            | resolvers.                                     |
       +----------------------------+------------------------------------------------+
       | RTDyldObjectLinkingLayer.h | Provides the RTDyldObjectLinkingLayer class.   |
       +----------------------------+------------------------------------------------+
       | Mangler.h                  | Provides the Mangler class for platform        |
       |                            | specific name-mangling.                        |
       +----------------------------+------------------------------------------------+
       | DynamicLibrary.h           | Provides the DynamicLibrary class, which       |
       |                            | makes symbols in the host process searchable.  |
       +----------------------------+------------------------------------------------+
       |                            | A fast output stream class. We use the         |
       | raw_ostream.h              | raw_string_ostream subclass for symbol         |
       |                            | mangling.                                      |
       +----------------------------+------------------------------------------------+
       | TargetMachine.h            | LLVM target machine description class.         |
       +----------------------------+------------------------------------------------+

.. [3] Actually they don't have to be lambdas; any object with a call operator
       will do, including plain old functions or std::functions.

.. [4] ``JITSymbol::getAddress`` will force the JIT to compile the definition of
       the symbol if it hasn't already been compiled, and since the compilation
       process could fail, getAddress must be able to return this failure.