[Richard Suchenwirth] 2002-11-03 - One of the amazing things about Tcl is that one can again and again discover new ways to use the built-in concepts: in this case, [array] and variable [trace]s. Functions are typically implemented as [proc]s, but here is an alternative - an array that acts like a function: it returns the result of applying the function body to its arguments (the key). As in [expr] functions, arguments are separated by commas instead of whitespace, to please the dollar parser, which doesn't group on parentheses: the command ''set f(x y)'' would assign the value "y)" to a scalar variable named "f(x"! Instead, you call such a function with e.g. two arguments like this:

 puts $f($x,$y)

The first dollar sign substitutes it as a variable (which it really is), but the associated trace computes the function value if the arguments do not yet exist as an array element, as implemented in the few lines of code below. A function definition is modeled on a [proc] definition (no default values or ''args'', though). We observe the following features:

   * the body is [eval]ed when necessary, hence not byte-compiled, hence slower
   * previous results are cached, which may save time on lengthy calculations (see the small demonstration after the tests below)
   * the cache will consume (but not leak!) memory if the function is called often with different arguments
   * variable scope applies: functions defined at global scope must be marked as such when called inside a [proc]
   * on the other hand, functions defined in a proc are truly local: not visible from elsewhere (except via [uplevel]); the array as well as its trace disappear on [return] from the defining proc

As so often, I'm not so sure what this construct is good for, compared to [proc]s, but it sure made a nice Saturday evening braintwister. From the features above, I conclude that it may be helpful if you have calculations that run long even when byte-compiled, and call them repeatedly with the same arguments, so caching makes sense. Also, you get a C-like calling style, with arguments separated by commas in parentheses - but this is still Tcl, and experience has taught me that it's best not to stray too far from [the Tcl way].

Another consideration: in [Braintwisters], "dynamic variables" were introduced whose value is computed when needed, and may change over time. At that time, I thought them "as powerful as an argument-less [proc]" - the array-based functions described here are (almost) as powerful as a [proc] with arguments. Just slower - but then again, maybe this little experiment may still be useful for whatever you're doing in Tcl...

 proc function {name argl body} {
     upvar 1 $name f
     array set f {}   ;# make it exist (but add no elements)
     trace variable f rw [list runFunction $argl $body]
 }
 proc runFunction {argl body name el op} {
     if {$op == "w"} {error "can't set function value for $name"}
     upvar 1 $name f
     if {![info exists f($el)]} {
         foreach $argl [split $el ,] break   ;# assign the actual to the formal arguments
         set f($el) [eval $body]             ;# compute and cache the result
     } else {set f($el)}
 }

 # Testing... (expected results in front of each puts)
 #-- define an array-based function
 function sq2 {x z} {expr {$x*$x + $z}}
 #-- try to call it from global scope
 puts "42? $sq2(6,6)"
 #-- try to call it from inside a proc
 proc test1 {} {puts $sq2(7,7)}
 catch test1 res
 puts "error? $res" ;# should raise a "no such variable" error
 #-- try again, but marked as global this time
 proc test2 {} {puts "56? $::sq2(7,7)"}
 test2
 #-- try a function defined locally inside a proc
 proc test3 {} {
     function local {x y z} {expr {$x*$y-$z}}
     puts "5? $local(3,2,1)"
 }
 test3
 #-- try to call it from global scope
 catch {set local(4,3,2)} res
 puts "error? $res" ;# should raise the same "no such variable" error
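To make the caching visible, here is a small added demonstration (a sketch, not part of the tests above; ''sq3'' and ''nCalls'' are names made up for it). The body is evaluated only once per distinct argument tuple; the second lookup is served straight from the array:

 #-- count how often the body is actually evaluated
 set nCalls 0
 function sq3 {x} {incr ::nCalls; expr {$x*$x}}
 puts "9? $sq3(3)"   ;# first access: body is evaluated, result cached
 puts "9? $sq3(3)"   ;# second access: served from the cache
 puts "evaluations: 1? $nCalls"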
$local(3,2,1)" } test3 #-- try to call it from global scope catch {set local(4,3,2)} res puts "error? $res" ;# should raise the same error ---- ''[escargo] 11/04/2002:'' I have heard this kind of behavior as ''memoizing'', and it is something used in other languages as well: * Python [http://aspn.activestate.com/ASPN/Cookbook/Python/Recipe/52201] * Scheme [http://www.cs.oberlin.edu/classes/dragn/labs/dynamic/dynamic10.html]