linux - How does JIT compilation in Java load dynamically compiled instructions into memory?
In Java, JVMs (e.g. HotSpot) are capable of JIT compilation, a technique used to speed up execution by compiling bytecode to native code. My question is, how does this technically happen? My understanding is that modern processors mark some memory areas as read-only and some sections as executable, in order to prevent malicious code from executing. So the JVM can't just write new "executable code" into the memory spaces it has access to (i.e. self-modifying code). I am therefore guessing that the JVM produces native code, writes it to a file, and uses the operating system's services to dynamically load that native code into memory, and that it maintains an internal mapping table of the addresses of the native code (function) locations in memory, so that after the operating system has loaded the dynamic code it can branch out to the native instructions.
I did see this answer: How is JIT compiled code injected in memory and executed?, but I'm confused as to why operating systems allow user programs such read+execute memory regions. Do other operating systems, i.e. Linux etc., offer something similar in order for JIT to work?

Can someone clarify my understanding?
In Linux, a memory segment can be set both writable and executable (and its protections can later be changed). Look at the mmap(2) and mprotect(2) syscalls.
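As an illustration of those two syscalls (this sketch is mine, not part of the original answer), a program can map an anonymous writable page and later flip it to read+execute with mprotect:

```c
/* Minimal sketch of the mmap/mprotect flow described above:
 * allocate an anonymous writable page, then change its
 * protection to read+execute once code has been written into it. */
#include <stdio.h>
#include <stdlib.h>
#include <sys/mman.h>
#include <unistd.h>

int main(void) {
    size_t len = (size_t)sysconf(_SC_PAGESIZE);

    /* Ask the kernel for one anonymous, private, writable page. */
    void *buf = mmap(NULL, len, PROT_READ | PROT_WRITE,
                     MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
    if (buf == MAP_FAILED) { perror("mmap"); return EXIT_FAILURE; }

    /* ... a JIT would emit machine code into buf here ... */

    /* Flip the page to read+execute after the code has been written. */
    if (mprotect(buf, len, PROT_READ | PROT_EXEC) != 0) {
        perror("mprotect"); return EXIT_FAILURE;
    }

    printf("page at %p is now read+execute\n", buf);
    munmap(buf, len);
    return EXIT_SUCCESS;
}
```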
The JVM produces the machine code in memory, without using any disk files. The JIT machinery writes the bytes into executable memory.
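Putting those two points together, here is a toy sketch of a JIT in miniature; it is not from the original answer, it assumes an x86-64 Linux target (the hard-coded bytes encode a function returning 42), and it maps the page writable+executable at once, which hardened kernels or security policies may refuse:

```c
/* Toy "JIT": emit the machine code for a function returning 42
 * into a writable+executable mapping and call it directly. */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <sys/mman.h>

int main(void) {
    /* x86-64 encoding of:  mov eax, 42 ; ret */
    static const unsigned char code[] = {
        0xb8, 0x2a, 0x00, 0x00, 0x00,  /* mov eax, 42 */
        0xc3                           /* ret         */
    };

    /* Map a page that is both writable and executable. */
    void *mem = mmap(NULL, 4096, PROT_READ | PROT_WRITE | PROT_EXEC,
                     MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
    if (mem == MAP_FAILED) { perror("mmap"); return EXIT_FAILURE; }

    /* "Compile": copy the generated bytes into the mapping. */
    memcpy(mem, code, sizeof code);

    /* Branch into the generated code through a function pointer,
     * much as the JVM branches to the native entry points it records. */
    int (*fn)(void) = (int (*)(void))mem;
    printf("generated code returned %d\n", fn());

    munmap(mem, 4096);
    return EXIT_SUCCESS;
}
```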
Notice that the JVM might not even want to change the protection of the generated machine code (it can simply generate it inside writable and executable memory segments), because since it is the one generating the code, it can make sure the code is not doing nasty things (read about proof-carrying code).
Read the just-in-time compilation, HotSpot, and virtual memory wiki pages, and try strace-ing a java process...
Some JVMs are free software (e.g. the one inside OpenJDK), so you can study their source code.