In thinking about the general issue of scripting versus programming I decided to look for some examples of early BASIC to see how close that really was to what we would now consider a simple scripting tool - i.e. an interpreted language without a requirement for static typing. To my surprise, one of the first hits Google produced included a paragraph I couldn't resist quoting after last Monday's disquisition on whether Bill Gates could be considered a programmer or not. It's from Mactech.com and applies to the state of play sometime in 1986:
Accuracy Benchmark
40 time1=TIMER :REM Use TIME for Softworks Basic
50 s=0
100 x=0
200 FOR n=1 TO 1000
300 s=s+x*x
400 x=x+.00123
500 NEXT n
600 PRINT s,x
700 time2=TIMER
800 Totaltime=time2-time1
900 PRINT "Elapsed Time =",Totaltime,"Seconds."
This same benchmark has been run on various desk top computers to compare their speed and accuracy in engineering or scientific applications. Most desk top computers convert real numbers to their binary equivalent before performing software math operations. This introduces an error depending on the precision of the math routines that can propagate through many math operations, producing a considerable error in the final result. This benchmark program, developed by R. Broucke at the University of Texas, Austin, tests for this error propagation. See Table 1 for the results on different computers.
Accuracy in these computers is a function of the number of bytes used to represent the mantissa. A three byte mantissa is used in most versions of Microsoft BASIC (single precision). This is illustrated by the systems that give 503.545 for an answer. The TRS-80, Altair 8800, Osborne MBASIC and IBM personal computer are typical of this version of Microsoft BASIC. Those versions of Microsoft BASIC implemented on a 6502 microprocessor typically use a four-byte mantissa, giving 503.543832 or something similar as the answer. Note that only those computers NOT using a Microsoft version of BASIC come up with the right answer ...
[Emphasis added].
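Just to see how that plays out now, here's the same loop redone in Python rather than BASIC - purely a convenience choice on my part, and it assumes you have numpy installed so the single-precision case can be forced, since modern interpreters hand you doubles by default. The single-precision run wanders away from the exact sum (about 503.5438) in the low digits, much as the article says the three-byte-mantissa BASICs did, while the double-precision run stays with it:

import numpy as np

def broucke(dtype):
    # The MacTech accuracy loop: s accumulates x*x while x steps by 0.00123
    s = dtype(0)
    x = dtype(0)
    step = dtype(0.00123)
    for _ in range(1000):
        s = s + x * x
        x = x + step
    return s, x

print("single precision:", broucke(np.float32))   # drifts from ~503.5438 in the low digits
print("double precision:", broucke(np.float64))   # agrees with ~503.5438 to many digits

Run both and you get the whole 1986 argument about mantissa width reproduced in a dozen lines.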
Page two of the MacTech piece is missing along with the data tables, but the part that's still there does contain several other examples of BASIC programming - including an example that fits the comparison I wanted to make:
ON MENU GOSUB menuevent
MENU ON
loop:
GOTO loop
HandleAct:MENU STOP:MOUSE STOP
ACT=DIALOG(0)
IF ACT=5 THEN GOSUB Setupwindow
MENU ON: MOUSE ON
RETURN
menuevent:
menunumber=MENU(0)
menuitem=MENU(1):MENU
ON menunumber GOSUB menu1,menu2,menu3
RETURN
menu1:
IF menuitem=1 THEN Quit
RETURN
menu2:
'This is the Edit menu
'use it for DA's only
RETURN
Now, if we consider the essence of a scripting language to be its ability to handle dynamic typing, then this has to be considered scripting, not programming. In those days, i.e. before AT&T lost out on its opportunity to own the development tools market by offering C with vi as Visual C, BASIC was pretty much a scalar language, but that has since changed, with type constructions like "Dim ExchangeDocumentCounter As Long" (;-)) entering the language as a consequence of both objectification and the inclusion of some non-scalar array extensions.
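The same drift is easy to see in a current scripting language. Here's a tiny sketch in Python - my choice purely because it's handy to run, nothing MacTech discussed - where the first counter is written in the old declaration-free style and the second uses the optional annotations later versions of the language added, roughly the Python analogue of "Dim ExchangeDocumentCounter As Long":

# Old scripting style: nothing declared, the counter is whatever it last held
exchange_document_counter = 0
exchange_document_counter = exchange_document_counter + 1

# Declaration-flavoured style: the annotation changes nothing at run time,
# but a checker such as mypy will enforce it, much as Dim ... As Long does
typed_counter: int = 0
typed_counter = typed_counter + 1

print(exchange_document_counter, typed_counter)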
But I think we have to ask whether acquiring declarations like these is sufficient to elevate BASIC from scripting to programming, or whether it just raises questions about the adequacy and value of the static-versus-dynamic typing distinction as a means of telling scripting from programming. If the latter, then perhaps we could argue that scripting languages, when sufficiently successful, eventually become programming languages through a kind of seamless, eventless growth process - and then point to Perl, Python, and PHP as examples where the same thing has happened or is happening.
I rather think so, and suggest that there's a nice inverse rule too: good programming languages, if commercially successful, eventually become scripting languages through the same kind of seamless, eventless growth process - meaning that you'll be scripting in Java before the decade is out!