Post by Lawrence D'Oliveiro
At one time, we distinguished between “scripting” languages and
“programming” languages. To begin with, the “scripting” languages were
somehow more limited in functionality than full-fledged “programming”
languages. Or they were slower, because they were interpreted.
I don't think there has ever been a clear distinction.
A "script" is usually small and often written for a particular task on a
particular system, while a "program" might be bigger and more generic,
runnable on multiple systems by multiple people. But there is no
dividing point in the type of code, and plenty of overlap - even though
the difference is often clear ("This is a script for making backups of
my servers - it uses the rsync program to do the bulk of the work").
Similarly, there are not "scripting languages" and "programming
languages". There are languages that are more suitable for script work,
and languages that are more suitable for programming work, and languages
that are suitable for both.
Then there are "interpreted" languages and "compiled" languages. As you
say, this is not a binary distinction - there are shades in between,
especially with byte compiling. Some languages, such as Python, are
used like interpreted languages (you "run" the source code) but are
byte-compiled on the fly. Some, like Java, are used like compiled
languages but generate byte code that is interpreted. Others use some
byte-compiled code along with JIT machine code to blur the lines even more.
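That on-the-fly byte compilation is easy to see directly. A minimal sketch using only Python's standard `dis` module (the `greet` function is my own illustration, not from the post):

```python
import dis

def greet(name):
    # Defining the function already compiled this source to CPython bytecode.
    return "Hello, " + name

# dis shows the bytecode instructions the interpreter will actually run,
# even though we never invoked a compiler explicitly.
dis.dis(greet)
```

The exact instructions printed vary between CPython versions, which is itself a reminder that the bytecode is an internal detail of the "interpreted" implementation.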
It is fair to say that "scripts" are usually written in interpreted
languages (or languages designed to look like they are interpreted, by
compiling or byte-compiling on the fly). "Programs" can be written in
interpreted or compiled languages - there is no consensus.
Post by Lawrence D'Oliveiro
Then languages like Perl and Java came along: both were compiled to a
bytecode, a sort of pseudo-machine-language, which was interpreted by
software, not CPU hardware. Were they “scripting” or “programming”
languages? Some might have classed Perl as a “scripting” language to
begin with, but given it is just as powerful as Java, then why
shouldn’t Java also be considered a “scripting” rather than
“programming” language? And before these two, there was UCSD Pascal,
which was probably the pioneer of this compile-to-bytecode idea.
Such classification is just wrong, IMHO. You can write scripts in Perl,
and you can write programs in Perl. "APL" is invariably (AFAIK)
interpreted, and it is for programming rather than scripting - the
acronym stands for "A Programming Language".
And of course there are many computer languages whose prime purpose is
other tasks, even though they can be used for programming - TeX and
Postscript are examples.
Post by Lawrence D'Oliveiro
So that terminology for distinguishing between classes of programming
languages became largely obsolete.
I am not at all convinced it was ever relevant to distinguish between
"scripting languages" and "programming languages". It was useful to
distinguish between "interpreted" and "compiled" languages, but the
overlap and blurring have increased there.
Post by Lawrence D'Oliveiro
But there is one distinction that I think is still relevant, and that
is the one between shell/command languages and programming languages.
In a shell language, everything you type is assumed to be a literal
string, unless you use special substitution sequences. E.g. in a POSIX
shell,
ls -l thingy
“give me information about the file/directory named ‘thingy’”, vs.
ls -l $thingy
“give me information about the files/directories whose names are in
the value of the variable ‘thingy’”.
Whereas in a programming language, everything is assumed to be a
language construct, and every unadorned name is assumed to reference
some value/object, so you need quote marks to demarcate literal
strings. E.g.
os.listdir(thingy)
“return a list of the contents of the directory whose name is in the
variable ‘thingy’”, vs.
os.listdir("thingy")
“return a list of the contents of the directory named ‘thingy’”.
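The two Python calls above can be run side by side. A minimal, self-contained sketch (the directory names "thingy" and "other" are illustrative, not from the post):

```python
import os
import tempfile

# Illustrative setup: a directory literally named "thingy", plus a second
# directory whose name is held in a variable that is also called thingy.
base = tempfile.mkdtemp()
os.mkdir(os.path.join(base, "thingy"))
os.mkdir(os.path.join(base, "other"))
open(os.path.join(base, "other", "afile"), "w").close()
os.chdir(base)

thingy = "other"

# Quoted: the literal directory name "thingy" (which is empty).
print(os.listdir("thingy"))   # []

# Bare name: the directory whose name is stored in the variable thingy.
print(os.listdir(thingy))     # ['afile']
```

Note that the defaults are exactly reversed relative to the shell: the unadorned word is the variable, and it is the literal string that needs extra syntax.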
This difference in design has to do with their typical usage: most of
the use of a shell/command language is in typing a single command at a
time, for immediate execution. Whereas a programming language is
typically used to construct sequences consisting of multiple lines of
code before they are executed.
That is arguably a useful distinction in the style of programming
languages, and this difference makes the language more or less suited to
particular tasks (such as typical short scripts).
Again, however, there are exceptions that mean a clear binary
distinction is not possible. Knuth did a lot of work on "literate
programming", where documentation is combined with the source code, and
he used such languages and tools for programs like TeX
and Metafont. ("Linux from Scratch" is another example.)
TCL is a language that might be considered half-way between your
categories here.
Post by Lawrence D'Oliveiro
This difference is also why attempts to use programming languages as
though they were shell/command languages, entering and executing a
single line of code at a time, tend to end up being more trouble than
they are worth.
Conversely, using shell/command languages as programming languages, by
collecting multiple lines of code into shell scripts, does work, but
only up to a point. The concept of variable substitution via string
substitution tends to lead to trouble when trying to do more advanced
data manipulations.
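One concrete form of that trouble is word splitting. A minimal sketch (driven from Python so it is self-contained; the filename is hypothetical) showing how an unquoted variable breaks on a name containing a space:

```python
import os
import subprocess
import tempfile

# Hypothetical setup: a file whose name contains a space.
d = tempfile.mkdtemp()
open(os.path.join(d, "my file.txt"), "w").close()

# Unquoted $thingy undergoes word splitting: ls receives the two
# arguments "my" and "file.txt", neither of which exists, so it fails.
bad = subprocess.run(["sh", "-c", 'thingy="my file.txt"; ls $thingy'],
                     cwd=d, capture_output=True, text=True)

# Quoting "$thingy" passes the value as a single argument; ls succeeds.
good = subprocess.run(["sh", "-c", 'thingy="my file.txt"; ls "$thingy"'],
                      cwd=d, capture_output=True, text=True)

print(bad.returncode)    # non-zero
print(good.returncode)   # 0
```

Because the substitution is textual, every use of a variable is a potential splitting or globbing hazard, which is exactly the kind of problem that grows with the complexity of the data being handled.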
So, in short, while there is some overlap in their applicable usage
areas, they are still very much oriented to different application
scenarios.
<https://en.wikipedia.org/wiki/List_of_programming_languages_by_type>
gives something like 40 categories of programming languages, of which
"scripting languages" is one type. I think any attempt at dividing up
programming languages will either be so full of grey areas as to be
almost useless, or have so many categories that it is almost useless.
The best you can do is pick some characteristics of languages, or some
typical use-cases of languages, and ask if any given language fits there.