scope

lexical
every inner level can access its outer levels - widely accepted to be the correct scoping decision.

dynamic
functions access the latest definition of a local variable - not deducible at compile time. This means that if some f's body sets x = 3 and then calls g, g has x = 3 in scope.
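one way to make the contrast concrete is to run the same toy program under both rules. this is a minimal haskell sketch (the Expr toy language, names, and the example program are my own, not anything from t or scheme): the lexical evaluator extends the environment the closure was defined in, while the dynamic evaluator extends the caller's current environment.

```haskell
import qualified Data.Map as M

data Expr
  = Var String
  | Lit Int
  | Lam String Expr          -- one-argument lambda
  | App Expr Expr
  | Let String Expr Expr
  deriving Show

type Env = M.Map String Value

data Value
  = IntV Int
  | Closure String Expr Env  -- remember the defining environment
  deriving Show

-- lexical: the closure's captured environment is the one we extend.
evalLex :: Env -> Expr -> Value
evalLex env (Var x)     = env M.! x
evalLex _   (Lit n)     = IntV n
evalLex env (Lam x b)   = Closure x b env
evalLex env (App f a)   =
  case evalLex env f of
    Closure x b cenv -> evalLex (M.insert x (evalLex env a) cenv) b
    _                -> error "lexical: applied a non-function"
evalLex env (Let x e b) = evalLex (M.insert x (evalLex env e) env) b

-- dynamic: the *caller's* environment is extended instead, so free
-- variables in a function body see the latest bindings at call time.
evalDyn :: Env -> Expr -> Value
evalDyn env (Var x)     = env M.! x
evalDyn _   (Lit n)     = IntV n
evalDyn env (Lam x b)   = Closure x b env
evalDyn env (App f a)   =
  case evalDyn env f of
    Closure x b _ -> evalDyn (M.insert x (evalDyn env a) env) b
    _             -> error "dynamic: applied a non-function"
evalDyn env (Let x e b) = evalDyn (M.insert x (evalDyn env e) env) b

-- g = \_ -> x, called from a scope where x has been rebound to 3.
example :: Expr
example =
  Let "x" (Lit 1) $
  Let "g" (Lam "_" (Var "x")) $
  Let "x" (Lit 3) $
  App (Var "g") (Lit 0)

main :: IO ()
main = do
  print (evalLex M.empty example)  -- IntV 1: g sees the x it was defined with
  print (evalDyn M.empty example)  -- IntV 3: g sees the caller's latest x
```

under the lexical rule g returns the x it closed over (1); under the dynamic rule it returns the caller's latest x (3).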

from "history of t"

http://www.paulgraham.com/thist.html some of these notes are prompted by this

intro functional languages: rabbit (the first scheme compiler), the sussman/steele lambda papers, FP languages such as Hope, Miranda and Scheme. franz lisp? Bliss language - c-level language from CMU?

pdp-10 hack: remove a cons cell from the freelist, update the freelist, and branch to the gc if the freelist was exhausted, all in a single instruction!

premature optimization is bad, no matter what. do not take shortcuts to simplify what is an already well-designed system! one key tip-off phrase is always something of the form, "We'll throw out all the old cruft, start over fresh, and just Do Things Right."

fixnum :: a 'fixed number', or native machine word

building a compiler 'self-hosted', or incrementally - hosting one in another. how is it done?

maclisp on the pdp-10 has mark and sweep run on the register set - "bibop" scheme with all objects boxed, and segregated by type into pages of memory

stop and copy garbage collector: the Cheney garbage collection algorithm - does a bfs of the heap to find everything. t used the Clark algorithm: a dfs traversal that uses the heap itself to provide the search stack. can implement mark and sweep with the same costs as stop and copy!

norman ramsey: 'good' schemes use a range of implementations for lambdas, depending on what we know about them at compile time. some turn into nothing, some are control flow, some are stack frames, some cause heap allocation. must understand how the compiler handles everything, and scheme is built on this lambda structure

lexemes :: construction of lexical meaning underlying words related through inflection. the lemma is one form of the lexeme

standard procedure :: standardized code for a procedure? TODO

compiler implementation as a tree of objects that could link back to their parents? whoah


these things are relatively communicative but i am not sure what they mean

DFA :: data-flow analysis - ?
loop-invariant hoisting :: - ?
global register allocation :: - ?
global common subexpression elimination :: - ? find two things that are alpha-equivalent then make them a single expression
copy propagation :: - ?
induction-variable elimination :: - ?

learn how to do more things with computer graphics! this area sounds super interesting but i do not know much about it.

assemblers :: norman adams. graph structure? linear text scheme? serializing the graph to minimize the span from jmps to the things they jump to. push and pop vs push and a generational garbage collector

beta-substitution :: - ?

four classic analyses:

  • go over programs forwards and backwards
  • unions and intersections – union = might possibly, intersection = always; keywords to google
  • available expressions – forwards intersection
  • very busy expressions – backwards intersection
  • reaching definitions – forwards union
  • live variables – backwards union

specific optimizations

dead code elimination
removes code that does not impact program results. this shrinks program size and avoids irrelevant operations. it covers both code that will never be executed and code that only affects dead variables – those that are never read from again. we can use this instead of optional code inclusion via a preprocessor that would perform the same task. much of the dead code it finds is created by other transformations, for example strength reduction, so we must ensure the analyses are performed in the right order. historically, this is performed using information derived from data-flow analysis, with an algorithm based on static single assignment (SSA) form.
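as a concrete (and hypothetical) illustration of "code that only affects dead variables", here is a minimal haskell sketch of dead-store elimination on a made-up straight-line IR; it assumes assignments have no side effects, which a real compiler would have to verify.

```haskell
import qualified Data.Set as S

data Stmt = Assign String [String]   -- target := expression using these vars
          | Return String            -- the program's observable result
  deriving Show

-- walk backwards, keeping an assignment only if its target may still be read.
deadCodeElim :: [Stmt] -> [Stmt]
deadCodeElim = reverse . go S.empty . reverse
  where
    go _    []       = []
    go live (s:rest) = case s of
      Return x -> s : go (S.insert x live) rest
      Assign x uses
        | x `S.member` live ->
            -- x is read later: keep the assignment, its operands become live
            s : go (S.union (S.fromList uses) (S.delete x live)) rest
        | otherwise ->
            -- x is never read again: the assignment is dead, drop it
            go live rest

example :: [Stmt]
example =
  [ Assign "a" []          -- a := constant
  , Assign "b" ["a"]       -- b := f(a)
  , Assign "c" ["a"]       -- c := g(a), but c is never read: dead
  , Return "b"
  ]

main :: IO ()
main = mapM_ print (deadCodeElim example)  -- the assignment to c disappears
```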

dynamic dead code elimination
some code sections represent dead or unreachable code only under specific conditions. we can accumulate those conditions at runtime to determine which processes or loadings to eliminate while we evaluate an expression.

strength reduction
most of a program's execution time is spent in a small section of code (the 'hot spot'), and that code runs inside a loop over and over. the compiler looks to recognize things inside the loop. this means finding loop invariants (values that do not change within the body of the loop) and induction variables (values iterated each time through the loop, changing by known amounts). we can often replace a multiplication by an induction variable with successive, weaker additions, which is an optimization. we can also hoist invariants outside of the loop, for example keeping them in registers. these are some good examples of strength reduction that can be used: https://en.wikipedia.org/wiki/Strength_reduction
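a small sketch of the classic induction-variable case, assuming a loop whose only use of the induction variable is to compute i * stride (the function names and numbers are hypothetical): the per-iteration multiplication becomes a running addition.

```haskell
-- before: each iteration performs a multiplication by the induction variable i.
offsetsMul :: Int -> Int -> [Int]
offsetsMul stride n = [ i * stride | i <- [0 .. n - 1] ]

-- after: the multiplication is reduced to repeated addition; the running
-- offset is itself an induction variable advanced by a known amount.
offsetsAdd :: Int -> Int -> [Int]
offsetsAdd stride n = take n (iterate (+ stride) 0)

main :: IO ()
main = print (offsetsMul 4 8 == offsetsAdd 4 8)  -- True: same sequence of offsets
```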

dataflow analysis
gathering information about the values calculated at various points during a computer program. set up dataflow equations for each node of the cfg, then solve them by repeatedly calculating the output from the input, locally at each node, until a fixpoint is reached – kildall's method. https://en.wikipedia.org/wiki/Data-flow_analysis
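a minimal sketch of that fixpoint iteration for live variables (the backwards union analysis from the list above). the cfg, node numbers, and use/def sets are made up; this does simple round-robin iteration rather than kildall's worklist, but it converges to the same fixpoint.

```haskell
import qualified Data.Map as M
import qualified Data.Set as S

type Node = Int
type Vars = S.Set String

-- a tiny hand-built cfg: 1 initializes, 2 is the loop test, 3 the body, 4 the exit.
use, def :: M.Map Node Vars
use = M.fromList [(1, S.empty),              (2, S.fromList ["i", "n"]),
                  (3, S.fromList ["i", "s"]), (4, S.fromList ["s"])]
def = M.fromList [(1, S.fromList ["i", "s"]), (2, S.empty),
                  (3, S.fromList ["i", "s"]), (4, S.empty)]

succs :: M.Map Node [Node]
succs = M.fromList [(1, [2]), (2, [3, 4]), (3, [2]), (4, [])]

nodes :: [Node]
nodes = [1 .. 4]

-- one round of the transfer function:
--   live-in(n)  = use(n) ∪ (live-out(n) \ def(n))
--   live-out(n) = union of live-in over successors (backwards, union analysis)
step :: M.Map Node Vars -> M.Map Node Vars
step liveIn = M.fromList [ (n, transfer n) | n <- nodes ]
  where
    liveOut n  = S.unions [ M.findWithDefault S.empty s liveIn | s <- succs M.! n ]
    transfer n = (use M.! n) `S.union` (liveOut n `S.difference` (def M.! n))

-- iterate until nothing changes: the least fixpoint of the equations.
fixpoint :: M.Map Node Vars -> M.Map Node Vars
fixpoint x = let x' = step x in if x' == x then x else fixpoint x'

main :: IO ()
main = mapM_ print (M.toList (fixpoint (M.fromList [ (n, S.empty) | n <- nodes ])))
```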

basic blocks :: - ?
dynamic software updating :: - ?
hot patching :: - ?
maurice wilkes – helped build the electronic delay storage automatic calculator – cool guy

Loop-invariant code motion - Wikipedia
Improve Software Debugging with Binary Analysis | some guy's blog
Program analysis resources

Polymorphism

parametric polymorphism
allows a function or data type to be written generically so that it can handle values identically without depending on their type. these are generic functions and generic datatypes respectively. the type of 'append' is generic, but is parameterized over its element type.
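for instance, in haskell a single definition of append works at every element type, and a datatype can take a type parameter the same way (the names here are just illustrative):

```haskell
-- a generic function: one definition, usable at every element type.
append :: [a] -> [a] -> [a]
append []     ys = ys
append (x:xs) ys = x : append xs ys

-- a generic datatype: the payload type is a parameter.
data Pair a = Pair a a deriving Show

main :: IO ()
main = do
  print (append [1, 2, 3 :: Int] [4, 5])  -- used at Int
  print (append "foo" "bar")              -- the same code used at Char
  print (Pair True False)                 -- the datatype used at Bool
```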

rank-1 polymorphism
type variables cannot be instantiated with polymorphic types.

rank k polymorphism
rank-k polymorphism enforces that a quantifier may not appear to the left of k or more arrows. type inference is decidable for rank 2, but not for rank 3 or above.

rank-n polymorphism
polymorphism in which quantifiers can appear to the left of arbitrarily many arrows.
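a small haskell sketch of the difference, using the RankNTypes extension: in the rank-2 type the quantifier sits to the left of an arrow, so the argument itself must be polymorphic and the body may use it at several types.

```haskell
{-# LANGUAGE RankNTypes #-}

-- rank-1: the (implicit) quantifier is outermost; the caller picks 'a' once.
applyTwice :: (a -> a) -> a -> a
applyTwice f x = f (f x)

-- rank-2: the quantifier appears to the left of one arrow, so the argument
-- 'f' must itself be polymorphic; here it is used at Int and at Bool.
applyToBoth :: (forall a. a -> a) -> (Int, Bool) -> (Int, Bool)
applyToBoth f (n, b) = (f n, f b)

main :: IO ()
main = do
  print (applyTwice (+ 1) (1 :: Int))  -- 3
  print (applyToBoth id (3, True))     -- (3, True)
```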
