=======================================================
Building a JIT: Starting out with KaleidoscopeJIT
=======================================================

.. contents::
   :local:

Chapter 1 Introduction
======================

Welcome to Chapter 1 of the "Building an ORC-based JIT in LLVM" tutorial. This
tutorial runs through the implementation of a JIT compiler using LLVM's
On-Request-Compilation (ORC) APIs. It begins with a simplified version of the
KaleidoscopeJIT class used in the
`Implementing a language with LLVM <LangImpl1.html>`_ tutorials and then
introduces new features like optimization, lazy compilation and remote
execution.

The goal of this tutorial is to introduce you to LLVM's ORC JIT APIs, to show
how these APIs interact with other parts of LLVM, and to teach you how to
recombine them to build a custom JIT that is suited to your use case.

The structure of the tutorial is:

- Chapter #1: Investigate the simple KaleidoscopeJIT class. This will
  introduce some of the basic concepts of the ORC JIT APIs, including the
  idea of an ORC *Layer*.

- `Chapter #2 <BuildingAJIT2.html>`_: Extend the basic KaleidoscopeJIT by adding
  a new layer that will optimize IR and generated code.

- `Chapter #3 <BuildingAJIT3.html>`_: Further extend the JIT by adding a
  Compile-On-Demand layer to lazily compile IR.

- `Chapter #4 <BuildingAJIT4.html>`_: Improve the laziness of our JIT by
  replacing the Compile-On-Demand layer with a custom layer that uses the ORC
  Compile Callbacks API directly to defer IR-generation until functions are
  called.

- `Chapter #5 <BuildingAJIT5.html>`_: Add process isolation by JITing code into
  a remote process with reduced privileges using the JIT Remote APIs.

To provide input for our JIT we will use the Kaleidoscope REPL from
`Chapter 7 <LangImpl7.html>`_ of the "Implementing a language with LLVM"
tutorial, with one minor modification: we will remove the FunctionPassManager
from the code for that chapter and replace it with optimization support in our
JIT class in Chapter #2.

Finally, a word on API generations: ORC is the third generation of LLVM JIT API.
It was preceded by MCJIT, and before that by the (now deleted) legacy JIT.
These tutorials don't assume any experience with these earlier APIs, but
readers acquainted with them will see many familiar elements. Where appropriate
we will make this connection with the earlier APIs explicit to help people who
are transitioning from them to ORC.

JIT API Basics
==============

The purpose of a JIT compiler is to compile code "on-the-fly" as it is needed,
rather than compiling whole programs to disk ahead of time as a traditional
compiler does. To support that aim our initial, bare-bones JIT API will be:

1. Handle addModule(Module &M) -- Make the given IR module available for
   execution.
2. JITSymbol findSymbol(const std::string &Name) -- Search for pointers to
   symbols (functions or variables) that have been added to the JIT.
3. void removeModule(Handle H) -- Remove a module from the JIT, releasing any
   memory that had been used for the compiled code.
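
Expressed as a plain C++ interface, the API above is small enough to sketch in
a few lines. This sketch is illustrative only: the class name and the Handle
type are placeholders for whatever a real implementation uses, and the
KaleidoscopeJIT class we examine below spells them out in terms of ORC
components.

.. code-block:: c++

  // Illustrative sketch of the bare-bones JIT API described above.
  class BareBonesJIT {
  public:
    // Make the given IR module available for execution.
    Handle addModule(Module &M);

    // Search for pointers to symbols (functions or variables) that have been
    // added to the JIT.
    JITSymbol findSymbol(const std::string &Name);

    // Remove a module from the JIT, releasing any memory that had been used
    // for its compiled code.
    void removeModule(Handle H);
  };
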

A basic use-case for this API, executing the 'main' function from a module,
will look like:

.. code-block:: c++

  std::unique_ptr<Module> M = buildModule();
  JIT J;
  Handle H = J.addModule(*M);
  int (*Main)(int, char*[]) =
      (int (*)(int, char*[]))J.findSymbol("main").getAddress();
  int Result = Main(0, nullptr);
  J.removeModule(H);

The APIs that we build in these tutorials will all be variations on this simple
theme. Behind the API we will refine the implementation of the JIT to add
support for optimization and lazy compilation. Eventually we will extend the
API itself to allow higher-level program representations (e.g. ASTs) to be
added to the JIT.

KaleidoscopeJIT
===============

In the previous section we described our API; now we will examine a simple
implementation of it: the KaleidoscopeJIT class [1]_ that was used in the
`Implementing a language with LLVM <LangImpl1.html>`_ tutorials. We will use
the REPL code from `Chapter 7 <LangImpl7.html>`_ of that tutorial to supply the
input for our JIT: each time the user enters an expression the REPL will add a
new IR module containing the code for that expression to the JIT. If the
expression is a top-level expression like '1+1' or 'sin(x)', the REPL will also
use the findSymbol method of our JIT class to find and execute the code for the
expression, and then use the removeModule method to remove the code again
(since there's no way to re-invoke an anonymous expression). In later chapters
of this tutorial we'll modify the REPL to enable new interactions with our JIT
class, but for now we will take this setup for granted and focus our attention
on the implementation of our JIT itself.
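
To make this interaction concrete, here is a rough sketch of what the REPL's
handling of a top-level expression might look like. It is not code from
KaleidoscopeJIT.h itself; the names it uses (FnAST, TheModule, TheJIT, and the
"__anon_expr" function name) are assumptions borrowed from the Kaleidoscope
REPL, where each top-level expression is compiled into a function taking no
arguments and returning a double.

.. code-block:: c++

  // Sketch: one REPL iteration for a top-level expression (assumed names).
  if (FnAST->codegen()) {
    // JIT the module containing the anonymous expression...
    auto H = TheJIT->addModule(std::move(TheModule));

    // ...find the compiled code for it...
    auto ExprSymbol = TheJIT->findSymbol("__anon_expr");
    assert(ExprSymbol && "Function not found");

    // ...cast the symbol's address to a callable function pointer and run it...
    double (*FP)() = (double (*)())(intptr_t)ExprSymbol.getAddress();
    fprintf(stderr, "Evaluated to %f\n", FP());

    // ...then delete the anonymous expression module from the JIT.
    TheJIT->removeModule(H);
  }
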

Our KaleidoscopeJIT class is defined in the KaleidoscopeJIT.h header. After the
usual include guards and #includes [2]_, we get to the definition of our class:

.. code-block:: c++

  #ifndef LLVM_EXECUTIONENGINE_ORC_KALEIDOSCOPEJIT_H
  #define LLVM_EXECUTIONENGINE_ORC_KALEIDOSCOPEJIT_H

  #include "llvm/ExecutionEngine/ExecutionEngine.h"
  #include "llvm/ExecutionEngine/RTDyldMemoryManager.h"
  #include "llvm/ExecutionEngine/SectionMemoryManager.h"
  #include "llvm/ExecutionEngine/Orc/CompileUtils.h"
  #include "llvm/ExecutionEngine/Orc/IRCompileLayer.h"
  #include "llvm/ExecutionEngine/Orc/LambdaResolver.h"
  #include "llvm/ExecutionEngine/Orc/ObjectLinkingLayer.h"
  #include "llvm/IR/Mangler.h"
  #include "llvm/Support/DynamicLibrary.h"

  namespace llvm {
  namespace orc {

  class KaleidoscopeJIT {
  private:

    std::unique_ptr<TargetMachine> TM;
    const DataLayout DL;
    ObjectLinkingLayer<> ObjectLayer;
    IRCompileLayer<decltype(ObjectLayer)> CompileLayer;

  public:

    typedef decltype(CompileLayer)::ModuleSetHandleT ModuleHandle;

Our class begins with four members: a TargetMachine, TM, which will be used
to build our LLVM compiler instance; a DataLayout, DL, which will be used for
symbol mangling (more on that later); and two ORC *layers*: an
ObjectLinkingLayer and an IRCompileLayer. We'll be talking more about layers in
the next chapter, but for now you can think of them as analogous to LLVM
Passes: they wrap up useful JIT utilities behind an easy-to-compose interface.
The first layer, ObjectLinkingLayer, is the foundation of our JIT: it takes
in-memory object files produced by a compiler and links them on the fly to make
them executable. This JIT-on-top-of-a-linker design was introduced in MCJIT;
however, the linker was hidden inside the MCJIT class. In ORC we expose the
linker so that clients can access and configure it directly if they need to. In
this tutorial our ObjectLinkingLayer will just be used to support the next layer
in our stack: the IRCompileLayer, which will be responsible for taking LLVM IR,
compiling it, and passing the resulting in-memory object files down to the
object linking layer below.

That's it for member variables; after that we have a single typedef:
ModuleHandle. This is the handle type that will be returned from our JIT's
addModule method, and can be passed to the removeModule method to remove a
module. The IRCompileLayer class already provides a convenient handle type
(IRCompileLayer::ModuleSetHandleT), so we just alias our ModuleHandle to this.

.. code-block:: c++

  KaleidoscopeJIT()
      : TM(EngineBuilder().selectTarget()), DL(TM->createDataLayout()),
        CompileLayer(ObjectLayer, SimpleCompiler(*TM)) {
    llvm::sys::DynamicLibrary::LoadLibraryPermanently(nullptr);
  }

  TargetMachine &getTargetMachine() { return *TM; }

Next up we have our class constructor. We begin by initializing TM using the
EngineBuilder::selectTarget helper method, which constructs a TargetMachine for
the current process. Next we use our newly created TargetMachine to initialize
DL, our DataLayout. Then we initialize our IRCompileLayer. Our IRCompileLayer
needs two things: (1) a reference to our object linking layer, and (2) a
compiler instance to use to perform the actual compilation from IR to object
files. We use an off-the-shelf SimpleCompiler instance for now. Finally, in
the body of the constructor, we call the DynamicLibrary::LoadLibraryPermanently
method with a nullptr argument. Normally the LoadLibraryPermanently method is
called with the path of a dynamic library to load, but when passed a null
pointer it will 'load' the host process itself, making its exported symbols
available for execution.
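
The getTargetMachine accessor is not part of the JIT API proper, but it gives
the front end a way to configure the modules it builds to match the JIT's
target. As a brief sketch (the TheJIT and TheModule variables are assumed to be
the REPL's globals, as in the Kaleidoscope tutorials), the REPL can use it to
give each new module the right data layout before handing it to addModule:

.. code-block:: c++

  // Sketch: make the module's layout match the JIT's TargetMachine so that
  // the code we generate agrees with what the JIT will link and run.
  TheModule->setDataLayout(TheJIT->getTargetMachine().createDataLayout());
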

.. code-block:: c++

  ModuleHandle addModule(std::unique_ptr<Module> M) {
    // Build our symbol resolver:
    // Lambda 1: Look back into the JIT itself to find symbols that are part of
    //           the same "logical dylib".
    // Lambda 2: Search for external symbols in the host process.
    auto Resolver = createLambdaResolver(
        [&](const std::string &Name) {
          if (auto Sym = CompileLayer.findSymbol(Name, false))
            return Sym;
          return JITSymbol(nullptr);
        },
        [](const std::string &Name) {
          if (auto SymAddr =
                  RTDyldMemoryManager::getSymbolAddressInProcess(Name))
            return JITSymbol(SymAddr, JITSymbolFlags::Exported);
          return JITSymbol(nullptr);
        });

    // Build a singleton module set to hold our module.
    std::vector<std::unique_ptr<Module>> Ms;
    Ms.push_back(std::move(M));

    // Add the set to the JIT with the resolver we created above and a newly
    // created SectionMemoryManager.
    return CompileLayer.addModuleSet(std::move(Ms),
                                     make_unique<SectionMemoryManager>(),
                                     std::move(Resolver));
  }

Now we come to the first of our JIT API methods: addModule. This method is
responsible for adding IR to the JIT and making it available for execution. In
this initial implementation of our JIT we will make our modules "available for
execution" by adding them straight to the IRCompileLayer, which will
immediately compile them. In later chapters we will teach our JIT to be lazier
and instead add the Modules to a "pending" list to be compiled if and when they
are first executed.

To add our module to the IRCompileLayer we need to supply two auxiliary objects
(as well as the module itself): a memory manager and a symbol resolver. The
memory manager will be responsible for managing the memory allocated to JIT'd
machine code, setting memory permissions, and registering exception handling
tables (if the JIT'd code uses exceptions). For our memory manager we will use
the SectionMemoryManager class: another off-the-shelf utility that provides all
the basic functionality we need. The second auxiliary class, the symbol
resolver, is more interesting for us. It exists to tell the JIT where to look
when it encounters an *external symbol* in the module we are adding. External
symbols are any symbols not defined within the module itself, including calls to
functions outside the JIT and calls to functions defined in other modules that
have already been added to the JIT. It may seem as though modules added to the
JIT should "know about one another" by default, but since we would still have to
supply a symbol resolver for references to code outside the JIT, it turns out to
be easier to just reuse this one mechanism for all symbol resolution. This has
the added benefit that the user has full control over the symbol resolution
process. Should we search for definitions within the JIT first, then fall back
on external definitions? Or should we prefer external definitions where
available and only JIT code if we don't already have an available
implementation? By using a single symbol resolution scheme we are free to choose
whatever makes the most sense for any given use case.
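
For example, if we wanted the opposite policy, preferring definitions from the
host process over JIT'd definitions, we could express that entirely in the
resolver. The following is only a sketch (the PreferProcessResolver name is
ours, and this is not what KaleidoscopeJIT does); it reuses the same
createLambdaResolver call shown in addModule above, but checks the process in
the first callback before falling back to the JIT, leaving the second callback
with nothing left to do.

.. code-block:: c++

  // Sketch: a resolver that prefers symbols from the host process, falling
  // back to JIT'd definitions only when the process has no definition.
  auto PreferProcessResolver = createLambdaResolver(
      [&](const std::string &Name) {
        if (auto SymAddr =
                RTDyldMemoryManager::getSymbolAddressInProcess(Name))
          return JITSymbol(SymAddr, JITSymbolFlags::Exported);
        if (auto Sym = CompileLayer.findSymbol(Name, false))
          return Sym;
        return JITSymbol(nullptr);
      },
      [](const std::string &Name) { return JITSymbol(nullptr); });
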

Building a symbol resolver is made especially easy by the *createLambdaResolver*
function. This function takes two lambdas [3]_ and returns a JITSymbolResolver
instance. The first lambda is used as the implementation of the resolver's
findSymbolInLogicalDylib method, which searches for symbol definitions that
should be thought of as being part of the same "logical" dynamic library as this
Module. If you are familiar with static linking, this means that
findSymbolInLogicalDylib should expose symbols with common linkage and hidden
visibility. If all this sounds foreign you can ignore the details and just
remember that this is the first method that the linker will use to try to find a
symbol definition. If the findSymbolInLogicalDylib method returns a null result
then the linker will call the second symbol resolver method, called findSymbol,
which searches for symbols that should be thought of as external to (but
visible from) the module and its logical dylib. In this tutorial we will adopt
the following simple scheme: all modules added to the JIT will behave as if they
were linked into a single, ever-growing logical dylib. To implement this, our
first lambda (the one defining findSymbolInLogicalDylib) will just search for
JIT'd code by calling the CompileLayer's findSymbol method. If we don't find a
symbol in the JIT itself we'll fall back to our second lambda, which implements
findSymbol. This will use the RTDyldMemoryManager::getSymbolAddressInProcess
method to search for the symbol within the program itself. If we can't find a
symbol definition via either of these paths the JIT will refuse to accept our
module, returning a "symbol not found" error.

Now that we've built our symbol resolver we're ready to add our module to the
JIT. We do this by calling the CompileLayer's addModuleSet method [4]_. Since
we only have a single Module and addModuleSet expects a collection, we will
create a vector of modules and add our module as the only member. Since we
have already typedef'd our ModuleHandle type to be the same as the
CompileLayer's handle type, we can return the handle from addModuleSet
directly from our addModule method.

.. code-block:: c++

  JITSymbol findSymbol(const std::string Name) {
    std::string MangledName;
    raw_string_ostream MangledNameStream(MangledName);
    Mangler::getNameWithPrefix(MangledNameStream, Name, DL);
    return CompileLayer.findSymbol(MangledNameStream.str(), true);
  }

  void removeModule(ModuleHandle H) {
    CompileLayer.removeModuleSet(H);
  }

Now that we can add code to our JIT, we need a way to find the symbols we've
added to it. To do that we call the findSymbol method on our IRCompileLayer,
but with a twist: we have to *mangle* the name of the symbol we're searching
for first. The reason for this is that the ORC JIT components use mangled
symbols internally, the same way a static compiler and linker would, rather
than using plain IR symbol names. The kind of mangling will depend on the
DataLayout, which in turn depends on the target platform. To allow us to
remain portable and search based on the un-mangled name, we just reproduce
this mangling ourselves.
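
As a concrete illustration of what this mangling step does, consider a target
whose DataLayout specifies a global symbol prefix of '_' (macOS, for example).
On such a target the IR-level name "main" corresponds to the linker-level name
"_main", and that is the name the ORC layers and the JIT'd object files use.
The sketch below is just the body of findSymbol with a fixed name, shown to
make the transformation visible; it is not additional API.

.. code-block:: c++

  // Sketch: mangling "main" by hand using the same Mangler call findSymbol
  // uses. On a '_'-prefixing target, Mangled ends up holding "_main".
  std::string Mangled;
  raw_string_ostream MangledStream(Mangled);
  Mangler::getNameWithPrefix(MangledStream, "main", DL);
  // CompileLayer.findSymbol(MangledStream.str(), true) would now search for
  // "_main" in the JIT'd code.
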

We now come to the last method in our JIT API: removeModule. This method is
responsible for destroying the MemoryManager and SymbolResolver that were
added with a given module, freeing any resources they were using in the
process. In our Kaleidoscope demo we rely on this method to remove the module
representing the most recent top-level expression, preventing it from being
treated as a duplicate definition when the next top-level expression is
entered. It is generally good to free any module that you know you won't need
to call further, just to free up the resources dedicated to it. However, you
don't strictly need to do this: all resources will be cleaned up when your
JIT class is destructed, if they haven't been freed before then.

This brings us to the end of Chapter 1 of Building a JIT. You now have a basic
but fully functioning JIT stack that you can use to take LLVM IR and make it
executable within the context of your JIT process. In the next chapter we'll
look at how to extend this JIT to produce better quality code, and in the
process take a deeper look at the ORC layer concept.

`Next: Extending the KaleidoscopeJIT <BuildingAJIT2.html>`_

Full Code Listing
=================

Here is the complete code listing for our running example. To build this
example, use:

.. code-block:: bash

  # Compile
  clang++ -g toy.cpp `llvm-config --cxxflags --ldflags --system-libs --libs core orc native` -O3 -o toy
  # Run
  ./toy

Here is the code:

.. literalinclude:: ../../examples/Kaleidoscope/BuildingAJIT/Chapter1/KaleidoscopeJIT.h
   :language: c++

.. [1] Actually we use a cut-down version of KaleidoscopeJIT that makes a
       simplifying assumption: symbols cannot be re-defined. This will make it
       impossible to re-define symbols in the REPL, but will make our symbol
       lookup logic simpler. Re-introducing support for symbol redefinition is
       left as an exercise for the reader. (The KaleidoscopeJIT.h used in the
       original tutorials will be a helpful reference).

.. [2] +-----------------------+-----------------------------------------------+
       | File                  | Reason for inclusion                          |
       +=======================+===============================================+
       | ExecutionEngine.h     | Access to the EngineBuilder::selectTarget     |
       |                       | method.                                       |
       +-----------------------+-----------------------------------------------+
       |                       | Access to the                                 |
       | RTDyldMemoryManager.h | RTDyldMemoryManager::getSymbolAddressInProcess|
       |                       | method.                                       |
       +-----------------------+-----------------------------------------------+
       | SectionMemoryManager.h| Provides the SectionMemoryManager class, the  |
       |                       | memory manager used in addModule.             |
       +-----------------------+-----------------------------------------------+
       | CompileUtils.h        | Provides the SimpleCompiler class.            |
       +-----------------------+-----------------------------------------------+
       | IRCompileLayer.h      | Provides the IRCompileLayer class.            |
       +-----------------------+-----------------------------------------------+
       |                       | Access the createLambdaResolver function,     |
       | LambdaResolver.h      | which provides easy construction of symbol    |
       |                       | resolvers.                                    |
       +-----------------------+-----------------------------------------------+
       | ObjectLinkingLayer.h  | Provides the ObjectLinkingLayer class.        |
       +-----------------------+-----------------------------------------------+
       | Mangler.h             | Provides the Mangler class for platform       |
       |                       | specific name-mangling.                       |
       +-----------------------+-----------------------------------------------+
       | DynamicLibrary.h      | Provides the DynamicLibrary class, which      |
       |                       | makes symbols in the host process searchable. |
       +-----------------------+-----------------------------------------------+

.. [3] Actually they don't have to be lambdas: any object with a call operator
       will do, including plain old functions or std::functions.

.. [4] ORC layers accept sets of Modules, rather than individual ones, so that
       all Modules in the set could be co-located by the memory manager, though
       this feature is not yet implemented.