Jasper wrote: What is open-coding exactly? (sorry :p) Whatever it is, inlining achieves it. (Right?)
Open-coding is like a very primitive form of inlining. The main difference is that it's performed directly by the compiler and thus can be even more efficient for certain operations. Essentially, the compiler is programmed to understand certain primitive functions (cons, car, cdr, length, etc.) and it just spits out the exact right machine code for those at the call site. Like inlining, it eliminates the function call, but unlike inlining, it doesn't rely on a bunch of compiler analysis to optimize things further. Because the compiler author
knows exactly how standard primitives like car or length work, he can optimize them by hand when the compiler is written.
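A rough way to see the idea from user land (a toy sketch, not how a compiler implements open-coding internally): define-compiler-macro lets you hand the compiler the "exact right code" for a known function at each call site, much as the compiler itself does for car or cons. SQUARE here is a hypothetical example function, not a standard primitive.

```lisp
;; Toy sketch: emulate open-coding from user land with a compiler macro.
;; SQUARE is a hypothetical example function, not a standard primitive.
(defun square (x)
  (* x x))                ; ordinary definition, still usable via FUNCALL

(define-compiler-macro square (x)
  `(* ,x ,x))             ; code substituted at each direct call site
                          ; (note: evaluates X twice; fine for a sketch)

;; A direct call such as (square 5) may now compile as (* 5 5): the
;; function call disappears, just as CAR or LENGTH calls do when the
;; compiler open-codes them.
```

The difference from real open-coding is that here the substitution lives in user code; in the compiler it is hand-written machine-code generation for the primitive.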
Maybe I should be more specific. The procedure for expanding functions passed as arguments, step by step:
- Look for funcalls; where did the function argument of each funcall come from?
- A variable: trace where the variable came from, then apply one of the cases below to what you find. (This might not be possible if the value turns out to depend on a non-constant function, or if someone is iterating by reassigning a function or some such; but at that point you can't compare with macros anyway, since macros have no analogue short of going through eval.) If the function is used more than once, you might want to expand it into a local flet and let the Lisp decide whether to inline.
- (funcall (lambda (,@args) ,@body) ,@got) -> expand to (let (,@(combine args got)) ,@body), then repeat the search for funcalls.
- (funcall (some-fun ,@args) ,@got) -> expand some-fun until you either find that you can't, or reach the final lambda; in the latter case apply the rule above.
- (funcall #',fun-name ,@got) and (funcall ',fun-name ,@got) -> easy: (,fun-name ,@got).
Edit: I forgot to mention functions that take function arguments inside the function being inlined; you can inline those as well, using the same method.
I guess I have not covered apply yet.
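The lambda and #'name cases above can be sketched directly (a minimal sketch, assuming the form is already macroexpanded; EXPAND-FUNCALL is a hypothetical helper, and the variable and (some-fun ...) cases need a real code walker, so they are left out):

```lisp
;; Minimal sketch of the expansion rules above.  Handles the
;; (lambda ...) case and the #'name / 'name case; anything else is
;; returned unchanged.  Recursively repeating the search for funcalls
;; inside the result is not implemented here.
(defun expand-funcall (form)
  (destructuring-bind (op fn &rest got) form
    (assert (eq op 'funcall))
    (cond
      ;; (funcall (lambda (args...) body...) vals...)
      ;;   -> (let ((arg val)...) body...)
      ((and (consp fn) (eq (car fn) 'lambda))
       (destructuring-bind (lam args &rest body) fn
         (declare (ignore lam))
         `(let ,(mapcar #'list args got) ,@body)))
      ;; (funcall #'name vals...) or (funcall 'name vals...)
      ;;   -> (name vals...)
      ((and (consp fn) (member (car fn) '(function quote)))
       `(,(second fn) ,@got))
      ;; Non-constant function: give up and leave the funcall alone.
      (t form))))
```

For example, (expand-funcall '(funcall (lambda (x y) (+ x y)) 1 2)) yields (let ((x 1) (y 2)) (+ x y)).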
Most compilers already perform various transformations like this. Basic constant expressions are typically evaluated during compilation (e.g., (+ 2 2) is simply replaced with 4). This is not, however, what Olin was talking about. There are many cases where the call site cannot be optimized because you don't know what to inline within it. For instance, if I write
(mapcar #'xyzzy '(1 2 3)), and xyzzy is defined in another compilation unit, the compiler, even a very smart compiler, has no chance to optimize anything. While a good compiler can open-code the instructions for mapcar (essentially doing the equivalent of
(loop for temp in '(1 2 3) collect (funcall #'xyzzy temp)) ), it still must call xyzzy each time. If xyzzy is in the same compilation unit
and it is declared to be inline, then
perhaps a smart compiler will inline it in this instance, but realize that inline directives are hints, not orders. A compiler is not required to inline anything, no matter how many inline and speed directives you give it.
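Concretely, the hint looks like this (a minimal sketch; XYZZY and FROB are hypothetical names, and whether any given compiler actually inlines through the mapcar is entirely up to it):

```lisp
;; Ask -- not order -- the compiler to inline XYZZY.  The declamation
;; must be visible before the callers are compiled.
(declaim (inline xyzzy))
(defun xyzzy (n)
  (* n n))

;; If the compiler open-codes MAPCAR into a loop *and* honors the
;; INLINE hint, the per-element FUNCALL can become open-coded
;; multiplication.  Either way the result is the same list.
(defun frob ()
  (mapcar #'xyzzy '(1 2 3)))
```

Both declarations are purely advisory; the semantics of FROB do not change whether or not the inlining happens.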
As for user-written intermediate sequences, I don't know of anything that can be done about that, besides define-compiler-macro definitions specific to the functions. To be honest, functions that take functions as arguments yet produce all their output through their return value strike me as the wrong way to do it, except for very simple stuff like lists. If your program is a little more complicated, you have to think about how to collect the data to return. It is like connecting each of the houses to the power plant individually. You wouldn't have to do that if you just let the function return nothing (or maybe some information it stumbles upon that would be wasteful to throw away), letting the callbacks do the work, such as collection.
This is the kind of thing that Shivers was talking about, I think. There are many cases where the simple, obvious, static analysis falls short. Among these cases are many interesting ones (not strange, theoretical corner cases, for instance, but real practical problems). Devices like Common Lisp compiler macros were created, I think, to allow application writers to provide application-specific smarts to a standard compiler to help it along through some of the optimizations. While we can always theorize that a compiler
should be able to do such and such, some of the analysis gets very complex very quickly, and handling the general cases can be very error-prone. Many compilers fall well short of what might be theoretically possible.
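Compiler macros are exactly this hook: the application tells the compiler how to improve specific call sites, and declines when it can't. A minimal sketch (MY-ADD is a hypothetical function used only for illustration):

```lisp
;; Application-specific smarts via a compiler macro: fold MY-ADD calls
;; whose arguments are compile-time constants, and decline otherwise.
(defun my-add (a b)
  (+ a b))                         ; ordinary definition, the fallback

(define-compiler-macro my-add (&whole form a b)
  (if (and (constantp a) (constantp b))
      (+ (eval a) (eval b))        ; replace the call with its value
      form))                       ; decline: emit the original call
```

A call like (my-add 2 2) can thus compile to the literal 4, while (my-add x y) is left alone for the ordinary function, so the transformation is safe even when the compiler knows nothing about the arguments.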