Discussion: Experimental C Build System
vallor
2024-01-31 16:41:21 UTC
On Tue, 30 Jan 2024 19:22:00 +0000, Richard Harnden
But I'm wondering about one file which contains all the sources for the
program. Like an IDE project file but lighter weight.
In other words: a Makefile
Agreed; it's a solution looking for a problem.

$ make -j # how does Bart's new build manager handle this case?

("-j" engages parallel compilation.)

ObC:
$ cat try.c
#include <stdlib.h>
int main(void) {
return(system("make -j 16"));
}
_ _ _ _ _ _ _

$ cat Makefile
CFLAGS=-g -O2 -std=c90 -pedantic
_ _ _ _ _ _ _

$ make try
cc -g -O2 -std=c90 -pedantic try.c -o try

$ ./try
make: 'try' is up to date.
--
-v
Nicolas George
2024-01-31 18:55:58 UTC
Post by vallor
return(system("make -j 16"));
The return value of system() is not in the same format as the return value
of main().
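For illustration, a minimal sketch of the conversion, assuming a POSIX
system where the waitpid() status macros from <sys/wait.h> apply:

#include <stdlib.h>
#include <sys/wait.h>

int main(void) {
    int status = system("make -j 16");
    if (status == -1)
        return EXIT_FAILURE;           /* child process could not be created */
    if (WIFEXITED(status))
        return WEXITSTATUS(status);    /* unpack the 8-bit exit code */
    return EXIT_FAILURE;               /* shell was killed by a signal */
}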
Scott Lurndal
2024-01-31 19:15:22 UTC
Post by Nicolas George
Post by vallor
return(system("make -j 16"));
The return value of system() is not in the same format as the return value
of main().
Regardless, it suffices for a "success/failure" indication.
Lew Pitcher
2024-01-31 19:21:56 UTC
Post by Scott Lurndal
Post by Nicolas George
Post by vallor
return(system("make -j 16"));
The return value of system() is not in the same format as the return value
of main().
Regardless, it suffices for a "success/failure" indication.
Mostly. The only exception looks to be very rare: if no shell
is available, then system() returns 0. Thus, it is possible
(but, IMHO, highly unlikely) to have a failure condition that
reports as success.
--
Lew Pitcher
"In Skills We Trust"
Lawrence D'Oliveiro
2024-01-31 21:18:49 UTC
Post by Scott Lurndal
Post by Nicolas George
Post by vallor
return(system("make -j 16"));
The return value of system() is not in the same format as the return
value of main().
Regardless, it suffices for a "success/failure" indication.
Mostly. The only exception looks to be very rare: if no shell is
available, then system() returns 0.
Why does this need to go through a shell?
Scott Lurndal
2024-01-31 21:46:34 UTC
Post by Lawrence D'Oliveiro
Post by Scott Lurndal
Post by Nicolas George
Post by vallor
return(system("make -j 16"));
The return value of system() is not in the same format as the return
value of main().
Regardless, it suffices for a "success/failure" indication.
Mostly. The only exception looks to be very rare: if no shell is
available, then system() returns 0.
Why does this need to go through a shell?
Something needs to parse and execute the command line.

$ man 3 system
Lawrence D'Oliveiro
2024-02-01 00:25:53 UTC
Post by Scott Lurndal
Post by Lawrence D'Oliveiro
Post by vallor
return(system("make -j 16"));
Why does this need to go through a shell?
Something needs to parse and execute the command line.
There is no need to “parse” any command line.

<manpages.debian.org/3/exec.3.en.html>
Lew Pitcher
2024-02-01 00:54:33 UTC
Post by Lawrence D'Oliveiro
Post by Scott Lurndal
Post by Lawrence D'Oliveiro
Post by vallor
return(system("make -j 16"));
Why does this need to go through a shell?
Something needs to parse and execute the command line.
There is no need to “parse” any command line.
<manpages.debian.org/3/exec.3.en.html>
Lawrence, you will note that, in the original call
return(system("make -j 16"));
the argument to system() is a single string.

The equivalent exec() call takes a larger number of
arguments, including
a) the path to the program to be executed (in this case
something like "/usr/bin/make")
b) an array (or, for some of the exec() family, a list)
of pointers to strings representing each of the
arguments given to the program to be executed.
In this case, something like "make", "-j", "16", NULL
c) in some of the exec family, an array of environment
variable strings.

Do you see any of those /individual/ strings in the OP's
given argument to system()? I don't.

Yes, the OP /could/ have coded a fork/exec/wait codepath
and "hand parsed" the commandline, /BUT/ the OP instead
chose to use a handy utility function that does all that
for the cost of invoking a shell to parse out and build
the arguments to exec().
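
For comparison, a minimal fork/exec/wait sketch of roughly what
system("make -j 16") does on POSIX, minus the shell (error handling
trimmed; the argument list is split by hand):

#include <stdlib.h>
#include <unistd.h>
#include <sys/wait.h>

int main(void) {
    pid_t pid = fork();
    if (pid == -1)
        return EXIT_FAILURE;
    if (pid == 0) {
        /* child: arguments are already separated, so no shell is needed */
        char *argv[] = { "make", "-j", "16", NULL };
        execvp("make", argv);    /* searches PATH, as a shell would */
        _exit(127);              /* exec failed; 127 mirrors the shell */
    }
    int status;
    if (waitpid(pid, &status, 0) == -1)
        return EXIT_FAILURE;
    return WIFEXITED(status) ? WEXITSTATUS(status) : EXIT_FAILURE;
}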


Dude. Get a handle on it. You don't have to be so argumentative
about a simple choice like that. No one is saying that
system() is the only way to go in a case like this, and
you seem to be over-reacting to this simple piece of code.
--
Lew Pitcher
"In Skills We Trust"
Scott Lurndal
2024-02-01 01:31:30 UTC
Post by Lew Pitcher
Post by Scott Lurndal
Post by Lawrence D'Oliveiro
Post by vallor
return(system("make -j 16"));
Why does this need to go through a shell?
Something needs to parse and execute the command line.
There is no need to “parse” any command line.
<manpages.debian.org/3/exec.3.en.html>
Lawrence, you will note that, in the original call
return(system("make -j 16"));
the argument to system() is a single string.
The equivalent exec() call takes a larger number of
arguments, including
a) the path to the program to be executed (in this case
something like "/usr/bin/make")
b) an array (or, for some of the exec() family, a list)
of pointers to strings representing each of the
arguments given to the program to be executed.
In this case, something like "make", "-j", "16", NULL
c) in some of the exec family, an array of environment
variable strings.
Do you see any of those /individual/ strings in the OP's
given argument to system()? I don't.
Yes, the OP /could/ have coded a fork/exec/wait codepath
and "hand parsed" the commandline, /BUT/ the OP instead
chose to use a handy utility function that does all that
for the cost of invoking a shell to parse out and build
the arguments to exec().
Dude. Get a handle on it. You don't have to be so argumentative
about a simple choice like that. No one is saying that
system() is the only way to go in a case like this, and
you seem to be over-reacting to this simple piece of code.
Not to mention that
1) there are no html versions of the man pages installed.
2) even if there were, it's still a markup language that needs processing to be useful.
3) not everyone has an internet connection or wants to leave the terminal session to view a man
page. In fact most of the machines we run on are not connected to the internet at all
and the manpages for our application won't be on the internet anyway.
Lawrence D'Oliveiro
2024-02-01 03:01:56 UTC
Post by Lew Pitcher
Lawrence, you will note that, in the original call
return(system("make -j 16"));
the argument to system() is a single string.
Which is completely unnecessary.
Post by Lew Pitcher
The equivalent exec() call takes a larger number of
arguments, including ...
And is safer to use.

People need to stop requiring the interposition of a shell to invoke
commands, unless there really is a need for it.
Keith Thompson
2024-01-31 22:29:50 UTC
Post by Lawrence D'Oliveiro
Post by Scott Lurndal
Post by Nicolas George
Post by vallor
return(system("make -j 16"));
The return value of system() is not in the same format as the return
value of main().
Regardless, it suffices for a "success/failure" indication.
Mostly. The only exception looks to be very rare: if no shell is
available, then system() returns 0.
Why does this need to go through a shell?
Since this is comp.unix.programmer, POSIX specifies that system()
invokes the "sh" utility as if by calling fork() and exec().

In comp.lang.c, I'd say that system() determines whether the host system
has a *command processor*, and if so passes the string pointed to by its
argument to that command processor (i.e., shell).
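
In portable C, that availability test is spelled system(NULL), which
returns nonzero exactly when a command processor exists - a minimal
sketch:

#include <stdio.h>
#include <stdlib.h>

int main(void) {
    if (!system(NULL)) {
        fputs("no command processor available\n", stderr);
        return EXIT_FAILURE;
    }
    /* safe to hand a command string to the command processor */
    return system("make -j 16") == 0 ? EXIT_SUCCESS : EXIT_FAILURE;
}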
--
Keith Thompson (The_Other_Keith) Keith.S.Thompson+***@gmail.com
Working, but not speaking, for Medtronic
void Void(void) { Void(); } /* The recursive call of the void */
Nicolas George
2024-01-31 23:24:07 UTC
Post by Lew Pitcher
Mostly. The only exception looks to be very rare
Only exception?

Upon normal completion, system() returns the exit code of the child process
shifted left by 8 bits.

Only the 7 or 8 lower bits of the return value of main() end up in the exit
code of the process.

The lower 8 bits of something shifted 8 bits to the left: do the math.
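
A quick check, assuming a POSIX system with a shell available (the raw
status follows the waitpid() format, so an exit code of 42 comes back
as 42 << 8, i.e. 10752):

#include <stdio.h>
#include <stdlib.h>

int main(void) {
    int status = system("exit 42");
    printf("raw status: %d, exit code: %d\n",
           status, (status >> 8) & 0xff);
    return 0;
}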
vallor
2024-02-01 04:55:24 UTC
Post by Nicolas George
Post by Lew Pitcher
Mostly. The only exception looks to be very rare
Only exception?
Upon normal completion, system() returns the exit code of the child process
shifted left by 8 bits.
Only the 7 or 8 lower bits of the return value of main() end up in the exit
code of the process.
The lower 8 bits of something shifted 8 bits to the left: do the math.
That's not what my man page says:

* If all system calls succeed, then the return value is the
termination status of the child shell used to execute command.
(The termination status of a shell is the termination status of
the last command it executes.)
[...]
CONFORMING TO
POSIX.1-2001, POSIX.1-2008, C89, C99.
[...]

But it wasn't a serious program anyway, it was a silly little program whose
sole job was to make itself.

Why? Because Bart posted a "C" source file that used nothing
but his own pragmas to build a project.

(Hey: if he wants to re-invent the wheel, it shouldn't start
off square...)

Oh, and sidenote: I "researched" (asked ChatGPT) about
make(1) being POSIX, and it says it isn't. However,
make(1) IS part of the SUS.
--
-Scott
Lawrence D'Oliveiro
2024-02-01 05:44:47 UTC
Oh, and sidenote: I "researched" (asked ChatGPT) about make(1) being
POSIX, and it says it isn't.
POSIX Issue 7
<http://www.open-std.org/jtc1/sc22/open/n4217.pdf>, page 2830 onwards.
vallor
2024-02-01 05:54:13 UTC
On Thu, 1 Feb 2024 05:44:47 -0000 (UTC), Lawrence D'Oliveiro
Post by Lawrence D'Oliveiro
Oh, and sidenote: I "researched" (asked ChatGPT) about make(1) being
POSIX, and it says it isn't.
POSIX Issue 7
<http://www.open-std.org/jtc1/sc22/open/n4217.pdf>, page 2830 onwards.
Well, I stand corrected!

That's not the first time that Beast has been wrong. I think maybe
it makes it up as it goes along...

And thank you for the pointer. :)
--
-v
Nicolas George
2024-02-01 08:10:36 UTC
Post by vallor
* If all system calls succeed, then the return value is the
termination status of the child shell used to execute command.
(The termination status of a shell is the termination status of
the last command it executes.)
That IS what the man page says, you just read it wrong. Thankfully, Single
Unix decided to avoid the pitfall by making it more explicit:

“If command is not a null pointer, system() shall return the termination
status of the command language interpreter in the format specified by
waitpid().”

But you did not have to rely on badly worded sentences in English, you could
have just tested for yourself. You should have tested for yourself.
Post by vallor
Oh, and sidenote: I "researched" (asked ChatGPT) about
Asking ChatGPT about technical matters is a sure way of wasting your time
and ours.
vallor
2024-02-01 15:53:43 UTC
Post by Nicolas George
Post by vallor
* If all system calls succeed, then the return value is the
termination status of the child shell used to execute command.
(The termination status of a shell is the termination status of
the last command it executes.)
That IS what the man page says, you just read it wrong. Thankfully, Single
Unix decided to avoid the pitfall by making it more explicit:
“If command is not a null pointer, system() shall return the termination
status of the command language interpreter in the format specified by
waitpid().”
But you did not have to rely on badly worded sentences in English, you
could have just tested for yourself. You should have tested for
yourself.
You are correct, and next time I will.

I think I'll slink off and hide now...
--
-Scott
Scott Lurndal
2024-02-01 14:37:39 UTC
Post by vallor
Post by Nicolas George
Post by Lew Pitcher
Mostly. The only exception looks to be very rare
Only exception?
Upon normal completion, system() returns the exit code of the child process
shifted left by 8 bits.
Only the 7 or 8 lower bits of the return value of main() end up in the exit
code of the process.
The lower 8 bits of something shifted 8 bits to the left: do the math.
* If all system calls succeed, then the return value is the
termination status of the child shell used to execute command.
(The termination status of a shell is the termination status of
the last command it executes.)
[...]
CONFORMING TO
POSIX.1-2001, POSIX.1-2008, C89, C99.
[...]
But it wasn't a serious program anyway, it was a silly little program whose
sole job was to make itself.
Why? Because Bart posted a "C" source file that used nothing
but his own pragmas to build a project.
(Hey: if he wants to re-invent the wheel, it shouldn't start
off square...)
Oh, and sidenote: I "researched" (asked ChatGPT) about
make(1) being POSIX, and it says it isn't. However,
make(1) IS part of the SUS.
In this modern era, POSIX and the SUS are synonymous.
Kaz Kylheku
2024-02-01 16:23:41 UTC
Post by vallor
Oh, and sidenote: I "researched" (asked ChatGPT) about
make(1) being POSIX, and it says it isn't. However,
make(1) IS part of the SUS.
"POSIX" and "Single Unix Specification" are one entity.
It's been that way for years now.
--
TXR Programming Language: http://nongnu.org/txr
Cygnal: Cygwin Native Application Library: http://kylheku.com/cygnal
Mastodon: @***@mstdn.ca
Geoff Clare
2024-02-02 13:50:57 UTC
Post by Kaz Kylheku
Post by vallor
Oh, and sidenote: I "researched" (asked ChatGPT) about
make(1) being POSIX, and it says it isn't. However,
make(1) IS part of the SUS.
ChatGPT's answer was wrong. (Which is often the case.)
Post by Kaz Kylheku
"POSIX" and "Single Unix Specification" are one entity.
It's been that way for years now.
Assuming you are being lazy and saying "POSIX" when you really
mean "POSIX.1", that's true (since 2001 to be precise) for the
"base volumes" of SUS. (There is also an XCurses volume in SUS
that isn't in POSIX.1.)

But make was in "POSIX" before that too: in the POSIX.2 shell and
utilities standard, first published in 1992.
--
Geoff Clare <***@gclare.org.uk>
vallor
2024-01-31 19:01:01 UTC
Post by vallor
On Tue, 30 Jan 2024 19:22:00 +0000, Richard Harnden
But I'm wondering about one file which contains all the sources for the
program. Like an IDE project file but lighter weight.
In other words: a Makefile
Agreed; it's a solution looking for a problem.
$ make -j # how does Bart's new build manager handle this case?
("-j" engages parallel compilation.)
$ cat try.c
#include <stdlib.h>
int main(void) {
return(system("make -j 16"));
}
_ _ _ _ _ _ _
$ cat Makefile
CFLAGS=-g -O2 -std=c90 -pedantic
_ _ _ _ _ _ _
$ make try
cc -g -O2 -std=c90 -pedantic try.c -o try
$ ./try
make: 'try' is up to date.
I also had "try:" in my Makefile.

_ _ _ _ _ _ _
CFLAGS=-g -O2 -std=c90 -pedantic

try:
_ _ _ _ _ _ _

But I changed the source to name the target
explicitly:

$ cat try.c
#include <stdlib.h>
int main(void) {
return(system("make -j 16 try"));
}

$ ./try
cc -g -O2 -std=c90 -pedantic try.c -o try

$ ./try
make: 'try' is up to date.

(Beats trying to learn COBOL to keep up with
comp.lang.c... ;)
--
-v
bart
2024-01-31 20:25:07 UTC
Post by vallor
On Tue, 30 Jan 2024 19:22:00 +0000, Richard Harnden
But I'm wondering about one file which contains all the sources for the
program. Like an IDE project file but lighter weight.
In other words: a Makefile
Agreed; it's a solution looking for a problem.
Why do you think languages come with modules? That allows them to
discover their own modules, rather than rely on external apps where the
details are buried under appalling syntax and mixed up with a hundred
other matters.
Post by vallor
$ make -j # how does Bart's new build manager handle this case?
("-j" engages parallel compilation.)
$ cat try.c
#include <stdlib.h>
int main(void) {
return(system("make -j 16"));
}
_ _ _ _ _ _ _
$ cat Makefile
CFLAGS=-g -O2 -std=c90 -pedantic
_ _ _ _ _ _ _
$ make try
cc -g -O2 -std=c90 -pedantic try.c -o try
$ ./try
make: 'try' is up to date.
This on the other hand looks EXACTLY like a solution looking for a problem.


BTW that 'make' only works on my machine because it happens to be part
of mingw; none of my other C compilers have make.

And as written, it only works for 'cc' which comes with 'gcc'. If I use
CC to set another compiler, then the -o option is wrong for tcc. The
other options are not recognised with two other compilers.

Look at the follow-up to my OP that I will shortly post.
David Brown
2024-02-01 08:39:15 UTC
Post by bart
Post by vallor
On Tue, 30 Jan 2024 19:22:00 +0000, Richard Harnden
But I'm wondering about one file which contains all the sources for the
program. Like an IDE project file but lighter weight.
In other words: a Makefile
Agreed; it's a solution looking for a problem.
Why do you think languages come with modules? That allows them to
discover their own modules, rather than rely on external apps where the
details are buried under appalling syntax and mixed up with a hundred
other matters.
No, that is not at all the purpose of modules in programming. Note that
there is no specific meaning of "module", and different languages use
different terms for similar concepts. There are many features that a
language's "module" system might have - some have all, some have few:

1. It lets you split the program into separate parts - generally
separate files. This is essential for scalability for large programs.

2. You can compile modules independently to allow partial builds.

3. Modules generally have some way to specify exported symbols and
facilities that can be used by other modules.

4. Modules can "import" other modules, gaining access to those modules'
exported symbols.

5. Modules provide encapsulation of data, code and namespaces.

6. Modules can be used in a hierarchical system, building big modules
from smaller ones to support larger libraries with many files.

7. Modules provide a higher level concept that can be used by language
tools to see how the whole program fits together or interact with
package managers and librarian tools.


C provides 1, 2, 3, and 4 if you use a "file.c/file.h" organisation. It
provides a limited form of 5 (everything that is not exported is
"static"), but scaling to larger systems is dependent on identifier
prefixes.
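
As a concrete sketch of that "file.c/file.h" convention (the names
here are purely illustrative):

/* counter.h - the module's exported interface */
#ifndef COUNTER_H
#define COUNTER_H
void counter_increment(void);  /* "counter_" prefix as a poor man's namespace */
int counter_value(void);
#endif

/* counter.c - the implementation */
#include "counter.h"
static int count;              /* static: hidden from other modules */
void counter_increment(void) { count++; }
int counter_value(void) { return count; }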

You seem to be thinking purely about item 7 above. This is, I think,
common in interpreted languages (where modules have to be found at
run-time, where the user is there but the developer is not). Compiled
languages don't usually have such a thing, because developers (as
distinct from users) have build tools available that do a better job.
bart
2024-02-01 11:31:14 UTC
Post by bart
Post by vallor
On Tue, 30 Jan 2024 19:22:00 +0000, Richard Harnden
But I'm wondering about one file which contains all the sources for the
program. Like an IDE project file but lighter weight.
In other words: a Makefile
Agreed; it's a solution looking for a problem.
Why do you think languages come with modules? That allows them to
discover their own modules, rather than rely on external apps where
the details are buried under appalling syntax and mixed up with a
hundred other matters.
No, that is not at all the purpose of modules in programming.  Note that
there is no specific meaning of "module", and different languages use
different terms for similar concepts.  There are many features that a
language's "module" system might have - some have all, some have few:
1. It lets you split the program into separate parts - generally
separate files.  This is essential for scalability for large programs.
2. You can compile modules independently to allow partial builds.
3. Modules generally have some way to specify exported symbols and
facilities that can be used by other modules.
4. Modules can "import" other modules, gaining access to those modules'
exported symbols.
5. Modules provide encapsulation of data, code and namespaces.
6. Modules can be used in a hierarchical system, building big modules
from smaller ones to support larger libraries with many files.
7. Modules provide a higher level concept that can be used by language
tools to see how the whole program fits together or interact with
package managers and librarian tools.
C provides 1, 2, 3, and 4 if you use a "file.c/file.h" organisation.  It
provides a limited form of 5 (everything that is not exported is
"static"), but scaling to larger systems is dependent on identifier
prefixes.
You seem to be thinking purely about item 7 above.  This is, I think,
common in interpreted languages (where modules have to be found at
run-time, where the user is there but the developer is not).
I've been implementing languages with language-supported modules for
about 12 years.

They generally provide 1, 2, 4, 5, and 7 from your list, and partial
support of 6.

They don't provide 2 (compiling individual modules) because the aim is a
very fast, whole-program compiler.

While for 6, there is only a hierarchy between groups of modules, each
forming an independent sub-program or library. I tried a strict full
per-module hierarchy early on, mixed up with independent compilation; it
worked poorly.

The two levels allow you to assemble one binary out of groups of modules
that each represent an independent component or library.
Compiled
languages don't usually have such a thing, because developers (as
distinct from users) have build tools available that do a better job.
Given a module scheme, the tool needed to build a whole program should
not need to be told about the names and location of every constituent
module; it should be able to determine that from what's already in the
source code, given only a start point.

Even with independent compilation, you might be able to use that info to
determine dependencies, but you will need that module hierarchy if you
want to compile individual modules.

My view is that that tool only needs to be the compiler (a program that
does the 'full stack' from source files to executable binary) working
purely from the source code.

Yours is to have compilers, assemblers, linkers and make programs,
working with auxiliary data in makefiles, that itself have to be
generated by extra tools or special options, or built by hand.

I see that as old-fashioned and error-prone. Also complex and limited
(eg. they will not support my compiler.)

The experiment in my OP is intended to bring part of my module scheme to C.

However, that will of course be poorly received. Why? Because when a
language doesn't provide a certain feature (eg. module management), then
people are free to do all sorts of wild and whacky things to achieve
some result.

Approaches that don't fit in to the disciplined requirements of a
language-stipulated module scheme.

A good example is the header-based module scheme of my BCC compiler;
this required modules to be implemented as tidy .h/.c pairs of files. Of
course, real C code is totally chaotic in its use of headers.

In other words, you can't retro-fit a real module-scheme to C, not one
that will work with existing code.

But for all my projects and all the ones /I/ want to build, they do come
down to just knowing what source files need to be submitted to the
compiler. It really can be that simple. That CAN be trivially
retro-fitted to existing projects.
David Brown
2024-02-01 15:11:50 UTC
Post by bart
Post by bart
Post by vallor
On Tue, 30 Jan 2024 19:22:00 +0000, Richard Harnden
But I'm wondering about one file which contains all the sources for the
program. Like an IDE project file but lighter weight.
In other words: a Makefile
Agreed; it's a solution looking for a problem.
Why do you think languages come with modules? That allows them to
discover their own modules, rather than rely on external apps where
the details are buried under appalling syntax and mixed up with a
hundred other matters.
No, that is not at all the purpose of modules in programming.  Note
that there is no specific meaning of "module", and different languages
use different terms for similar concepts.  There are many features that a
language's "module" system might have - some have all, some have few:
1. It lets you split the program into separate parts - generally
separate files.  This is essential for scalability for large programs.
2. You can compile modules independently to allow partial builds.
3. Modules generally have some way to specify exported symbols and
facilities that can be used by other modules.
4. Modules can "import" other modules, gaining access to those
modules' exported symbols.
5. Modules provide encapsulation of data, code and namespaces.
6. Modules can be used in a hierarchical system, building big modules
from smaller ones to support larger libraries with many files.
7. Modules provide a higher level concept that can be used by language
tools to see how the whole program fits together or interact with
package managers and librarian tools.
C provides 1, 2, 3, and 4 if you use a "file.c/file.h" organisation.
It provides a limited form of 5 (everything that is not exported is
"static"), but scaling to larger systems is dependent on identifier
prefixes.
You seem to be thinking purely about item 7 above.  This is, I think,
common in interpreted languages (where modules have to be found at
run-time, where the user is there but the developer is not).
I've been implementing languages with language-supported modules for
about 12 years.
They generally provide 1, 2, 4, 5, and 7 from your list, and partial
support of 6.
Sure. Programming languages need that if they are to scale at all.
Post by bart
They don't provide 2 (compiling individual modules) because the aim is a
very fast, whole-program compiler.
Okay.


But what you are talking about to add to C is item 7, nothing more.
That is not adding "modules" to C. Your suggestion might be useful to
some people for some projects, but that doesn't make it "modules" in any
real sense.
Post by bart
While for 6, there is only a hierarchy between groups of modules, each
forming an independent sub-program or library. I tried a strict full
per-module hierarchy early on, mixed up with independent compilation; it
worked poorly.
The two levels allow you to assemble one binary out of groups of modules
that each represent an independent component or library.
Compiled
languages don't usually have such a thing, because developers (as
distinct from users) have build tools available that do a better job.
Given a module scheme, the tool needed to build a whole program should
not need to be told about the names and location of every constituent
module; it should be able to determine that from what's already in the
source code, given only a start point.
Why?

You can't just take some idea that you like, and that is suitable for
the projects you use, and assume it applies to everyone else.

I have no problem telling my build system, or compilers, where the files
are. In fact, I'd have a lot of problems if I couldn't do that. It is
not normal development practice to have the source files in the same
directory that you use for building the object code and binaries.
Post by bart
Even with independent compilation, you might be able to use that info to
determine dependencies, but you will need that module hierarchy if you
want to compile individual modules.
I already have tools for determining dependencies. What can your
methods do that mine can't?

(And don't bother saying that you can do it without extra tools -
everyone who wants "make" and "gcc" has them on hand. And those who
want an IDE that figures out dependencies for them have a dozen free
options there too. These are all standard tools available to everyone.)
Post by bart
My view is that that tool only needs to be the compiler (a program that
does the 'full stack' from source files to executable binary) working
purely from the source code.
Yours is to have compilers, assemblers, linkers and make programs,
working with auxiliary data in makefiles, that itself have to be
generated by extra tools or special options, or built by hand.
You want a limited little built-in tool. I want a toolbox that I can
use in all sorts of ways - for things you have never imagined. I can
see how your personal tools can be useful for you, as a single developer
on your own - if you want something else you can add it to those tools.
For others, they are useless.

Perhaps I would find your tools worked for a "Hello, world" project.
Maybe they were still okay as it got slightly bigger. Then I'd have
something that they could not handle, and I'd reach for make. What
would be the point of using "make" to automate - for example -
post-processing of a binary to add a CRC check, but using your tools to
handle the build? It's much easier just to use "make" for the whole thing.

You are offering me a fish. I am offering to teach you to fish,
including where to go to catch different kinds of fish. This is really
a no-brainer choice.
Post by bart
In other words, you can't retro-fit a real module-scheme to C, not one
that will work with existing code.
We know that. Otherwise it would have happened, long ago.
Malcolm McLean
2024-02-01 17:33:46 UTC
Post by bart
Post by bart
Post by vallor
On Tue, 30 Jan 2024 19:22:00 +0000, Richard Harnden
But I'm wondering about one file which contains all the sources for the
program. Like an IDE project file but lighter weight.
In other words: a Makefile
Agreed; it's a solution looking for a problem.
Why do you think languages come with modules? That allows them to
discover their own modules, rather than rely on external apps where
the details are buried under appalling syntax and mixed up with a
hundred other matters.
No, that is not at all the purpose of modules in programming.  Note
that there is no specific meaning of "module", and different
languages use different terms for similar concepts.  There are many
features that a language's "module" system might have - some have
all, some have few:
1. It lets you split the program into separate parts - generally
separate files.  This is essential for scalability for large programs.
2. You can compile modules independently to allow partial builds.
3. Modules generally have some way to specify exported symbols and
facilities that can be used by other modules.
4. Modules can "import" other modules, gaining access to those
modules' exported symbols.
5. Modules provide encapsulation of data, code and namespaces.
6. Modules can be used in a hierarchical system, building big modules
from smaller ones to support larger libraries with many files.
7. Modules provide a higher level concept that can be used by
language tools to see how the whole program fits together or interact
with package managers and librarian tools.
C provides 1, 2, 3, and 4 if you use a "file.c/file.h" organisation.
It provides a limited form of 5 (everything that is not exported is
"static"), but scaling to larger systems is dependent on identifier
prefixes.
You seem to be thinking purely about item 7 above.  This is, I think,
common in interpreted languages (where modules have to be found at
run-time, where the user is there but the developer is not).
I've been implementing languages with language-supported modules for
about 12 years.
They generally provide 1, 2, 4, 5, and 7 from your list, and partial
support of 6.
Sure.  Programming languages need that if they are to scale at all.
Post by bart
They don't provide 2 (compiling individual modules) because the aim is
a very fast, whole-program compiler.
Okay.
But what you are talking about to add to C is item 7, nothing more. That
is not adding "modules" to C.  Your suggestion might be useful to some
people for some projects, but that doesn't make it "modules" in any real
sense.
Post by bart
While for 6, there is only a hierarchy between groups of modules, each
forming an independent sub-program or library. I tried a strict full
per-module hierarchy early on, mixed up with independent compilation;
it worked poorly.
The two levels allow you to assemble one binary out of groups of
modules that each represent an independent component or library.
 > Compiled
 > languages don't usually have such a thing, because developers (as
 > distinct from users) have build tools available that do a better job.
Given a module scheme, the tool needed to build a whole program should
not need to be told about the names and location of every constituent
module; it should be able to determine that from what's already in the
source code, given only a start point.
Why?
You can't just take some idea that you like, and that is suitable for
the projects you use, and assume it applies to everyone else.
I have no problem telling my build system, or compilers, where the files
are.  In fact, I'd have a lot of problems if I couldn't do that.  It is
not normal development practice to have the source files in the same
directory that you use for building the object code and binaries.
Our system is that we've got two types of source generated by us, the
libraries which are used by all the programs, and the code specific to
each program. The library source code is placed on a central server and
then downloaded by conan (a package manager) which keeps it in a private
directory in the local machine not intended for viewing. The source
specific to the program is placed in a git project and synchronised with
git's remote repository facilities. Then IDE project files are built
with CMake. These with various other derived bits and bobs are placed in
a build folder, which is always under the git repository, but placed in
the ignore file and so not under git source control. The IDE is then
invoked on the project file in the build directory, and the executables
also go into the build directory. They then need to be moved to a
different location to be run.
CMake is set up so that it recursively crawls the source directories and
places every single source file into the IDE project file. This isn't
really recommended but it means you don't have to maintain CMakeLists files.
So it's an out of tree build. But we can't just place source in some
random location on the local machine and tell the system to pull it in.
Technically you could modify the CMake script to do that. But it would
break the whole system.
--
Check out Basic Algorithms and my other books:
https://www.lulu.com/spotlight/bgy1mm
bart
2024-02-01 18:34:08 UTC
Post by bart
Post by David Brown
1. It lets you split the program into separate parts - generally
separate files.  This is essential for scalability for large programs.
2. You can compile modules independently to allow partial builds.
3. Modules generally have some way to specify exported symbols and
facilities that can be used by other modules.
4. Modules can "import" other modules, gaining access to those
modules' exported symbols.
5. Modules provide encapsulation of data, code and namespaces.
6. Modules can be used in a hierarchical system, building big modules
from smaller ones to support larger libraries with many files.
7. Modules provide a higher level concept that can be used by
language tools to see how the whole program fits together or interact
with package managers and librarian tools.
C provides 1, 2, 3, and 4 if you use a "file.c/file.h" organisation.
It provides a limited form of 5 (everything that is not exported is
"static"), but scaling to larger systems is dependent on identifier
prefixes.
You seem to be thinking purely about item 7 above.  This is, I think,
common in interpreted languages (where modules have to be found at
run-time, where the user is there but the developer is not).
I've been implementing languages with language-supported modules for
about 12 years.
They generally provide 1, 2, 4, 5, and 7 from your list, and partial
support of 6.
Sure.  Programming languages need that if they are to scale at all.
Post by bart
They don't provide 2 (compiling individual modules) because the aim is
a very fast, whole-program compler.
Okay.
But what you are talking about to add to C is item 7, nothing more. That
is not adding "modules" to C.  Your suggestion might be useful to some
people for some projects, but that doesn't make it "modules" in any real
sense.
Item 7 is my biggest stumbling block when building open source C projects.

While the developer (say, you) knows the necessary info and can somehow
import it into the build system, my job is trying to get it out.

I can't use the intended build system because for one reason or another
it doesn't work, or requires complex dependencies (MSYS, CMake, MSTOOLS,
.configure), or I want to run mcc on it.

That info could trivially be added to the C source code. Nobody actually
needs to use my #pragma scheme; it could simply be a block comment on
one of the modules.

I'm sure that, with all your complicated tools, you could dump some
text that looks like:

// List of source files to build the binary from cipher.c:
// cipher.c
// hmac.c
// sha2.c

and prepend it to one of the files. Even a README will do.

That wouldn't hurt, would it?
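
Sketched as a pragma (illustrative syntax only):

#pragma module "cipher.c"
#pragma module "hmac.c"
#pragma module "sha2.c"

A compiler that understood this could be given cipher.c alone and find
the rest itself; one that didn't would simply ignore the unknown pragma.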
Post by bart
Given a module scheme, the tool needed to build a whole program should
not need to be told about the names and location of every constituent
module; it should be able to determine that from what's already in the
source code, given only a start point.
Why?
You can't just take some idea that you like, and that is suitable for
the projects you use, and assume it applies to everyone else.
I have no problem telling my build system, or compilers, where the files
are.  In fact, I'd have a lot of problems if I couldn't do that.  It is
not normal development practice to have the source files in the same
directory that you use for building the object code and binaries.
Post by bart
Even with independent compilation, you might be able to use that info
to determine dependencies, but you will need that module hierarchy if
you want to compile individual modules.
I already have tools for determining dependencies.  What can your
methods do that mine can't?
(And don't bother saying that you can do it without extra tools -
everyone who wants "make" and "gcc" has them on hand.  And those who
want an IDE that figures out dependencies for them have a dozen free
options there too.  These are all standard tools available to everyone.)
So, if C were to acquire modules, so that a C compiler could determine
all that for itself (maybe even work out for itself which need
recompiling), would you just ignore that feature and use the same
auxiliary methods you have always done?

You don't see that the language taking over task (1) of the things that
makefiles do, and possibly (2) (of the list I posted; repeated below),
can streamline makefiles to make them shorter, simpler, easier to write
and to read, and with fewer opportunities to get stuff wrong?

That was a rhetorical question. Obviously not.
Perhaps I would find your tools worked for a "Hello, world" project.
Maybe they were still okay as it got slightly bigger.  Then I'd have
something that they could not handle, and I'd reach for make.  What
would be the point of using "make" to automate - for example -
post-processing of a binary to add a CRC check, but using your tools to
handle the build?  It's much easier just to use "make" for the whole thing.
Because building one binary is a process that should be the job of the
compiler, not some random external tool that knows nothing of the
language or compiler.

Maybe you think makefiles should individually list all the 1000s of
functions of a project too?
You are offering me a fish.  I am offering to teach you to fish,
including where to go to catch different kinds of fish.  This is really
a no-brainer choice.
That analogy makes no sense.

Let me try and explain what I do: I write whole-program compilers. That
means that, each time you do a new build, it will reprocess each file
from source. They use the language's module scheme to know which files
to process.

I tend to build C programs by recompiling all modules too. So I want to
introduce the same convenience I have elsewhere.

It works for me, and I'm sure it could work for others if they didn't have
makefiles forced down their throats and hardwired into their brains.

----------------------------
(Repost)

I've already covered this in many posts on the subject. But 'make' deals
with three kinds of requirements:

(1) Specifying what the modules are to be compiled and combined into one
binary file

(2) Specifying dependencies between all files to allow rebuilding of that
one file with minimal recompilation

(3) Everything else needed in a complex project: running processes to
generate files like config.h, creating multiple binaries, specifying
dependencies between binaries, installation etc

My proposal tackles only (1), which is something that many languages now
have the means to deal with themselves. I already stated that (2) is not
covered.

But you may still need makefiles to deal with (3).

If your main requirement /is/ only (1), then my idea is to move the
necessary info into the source code, and tackle it with the C compiler.
Michael S
2024-02-01 20:23:28 UTC
On Thu, 1 Feb 2024 18:34:08 +0000
Post by bart
Post by bart
Post by David Brown
1. It lets you split the program into separate parts - generally
separate files.  This is essential for scalability for large programs.
2. You can compile modules independently to allow partial builds.
3. Modules generally have some way to specify exported symbols
and facilities that can be used by other modules.
4. Modules can "import" other modules, gaining access to those
modules' exported symbols.
5. Modules provide encapsulation of data, code and namespaces.
6. Modules can be used in a hierarchical system, building big
modules from smaller ones to support larger libraries with many
files.
7. Modules provide a higher level concept that can be used by
language tools to see how the whole program fits together or
interact with package managers and librarian tools.
C provides 1, 2, 3, and 4 if you use a "file.c/file.h"
organisation. It provides a limited form of 5 (everything that is
not exported is "static"), but scaling to larger systems is
dependent on identifier prefixes.
You seem to be thinking purely about item 7 above.  This is, I
think, common in interpreted languages (where modules have to be
found at run-time, where the user is there but the developer is
not).
I've been implementing languages with language-supported modules
for about 12 years.
They generally provide 1, 2, 4, 5, and 7 from your list, and
partial support of 6.
Sure.  Programming languages need that if they are to scale at all.
Post by bart
They don't provide 2 (compiling individual modules) because the
aim is a very fast, whole-program compiler.
Okay.
But what you are talking about to add to C is item 7, nothing more.
That is not adding "modules" to C.  Your suggestion might be useful
to some people for some projects, but that doesn't make it
"modules" in any real sense.
Item 7 is my biggest stumbling block when building open source C projects.
While the developer (say, you) knows the necessary info and can
somehow import it into the build system, my job is trying to get it out.
I can't use the intended build system because for one reason or
another it doesn't work, or requires complex dependencies (MSYS,
CMake, MSTOOLS, .configure), or I want to run mcc on it.
That info could trivially be added to the C source code. Nobody
actually needs to use my #pragma scheme; it could simply be a block
comment on one of the modules.
I'm sure that, with all your complicated tools, you could dump some text that looks like:
// cipher.c
// hmac.c
// sha2.c
and prepend it to one of the files. Even a README will do.
That wouldn't hurt, would it?
Post by bart
Given a module scheme, the tool needed to build a whole program
should not need to be told about the names and location of every
constituent module; it should be able to determine that from
what's already in the source code, given only a start point.
Why?
You can't just take some idea that you like, and that is suitable
for the projects you use, and assume it applies to everyone else.
I have no problem telling my build system, or compilers, where the
files are.  In fact, I'd have a lot of problems if I couldn't do
that.  It is not normal development practice to have the source
files in the same directory that you use for building the object
code and binaries.
Post by bart
Even with independent compilation, you might be able to use that
info to determine dependencies, but you will need that module
hierarchy if you want to compile individual modules.
I already have tools for determining dependencies.  What can your
methods do that mine can't?
(And don't bother saying that you can do it without extra tools -
everyone who wants "make" and "gcc" has them on hand.  And those
who want an IDE that figures out dependencies for them have a dozen
free options there too.  These are all standard tools available to
everyone.)
So, if C were to acquire modules, so that a C compiler could
determine all that for itself (maybe even work out for itself
which need recompiling), would you just ignore that feature and use
the same auxiliary methods you have always done?
You don't see that the language taking over task (1) of the things
that makefiles do, and possibly (2) (of the list I posted; repeated
below), can streamline makefiles to make them shorter, simpler,
easier to write and to read, and with fewer opportunities to get
stuff wrong?
That was a rhetorical question. Obviously not.
Perhaps I would find your tools worked for a "Hello, world"
project. Maybe they were still okay as it got slightly bigger.
Then I'd have something that they could not handle, and I'd reach
for make.  What would be the point of using "make" to automate -
for example - post-processing of a binary to add a CRC check, but
using your tools to handle the build?  It's much easier just to use
"make" for the whole thing.
Because building one binary is a process that should be the job of the
compiler, not some random external tool that knows nothing of the
language or compiler.
Maybe you think makefiles should individually list all the 1000s of
functions of a project too?
You are offering me a fish.  I am offering to teach you to fish,
including where to go to catch different kinds of fish.  This is
really a no-brainer choice.
That analogy makes no sense.
Let me try and explain what I do: I write whole-program compilers.
That means that, each time you do a new build, it will reprocess each
file from source. They use the language's module scheme to know which
files to process.
I tend to build C programs by recompiling all modules too. So I want
to introduce the same convenience I have elsewhere.
It works for me, and I'm sure it could work for others if they didn't
have makefiles forced down their throats and hardwired into their
brains.
----------------------------
(Repost)
I've already covered this in many posts on the subject. But 'make' deals with three kinds of requirements:
(1) Specifying what the modules are to be compiled and combined into
one binary file
(2) Specifying dependencies between all files to allow rebuilding of
that one file with minimal recompilation
(3) Everything else needed in a complex project: running processes to
generate files like config.h, creating multiple binaries,
specifying dependencies between binaries, installation etc
My proposal tackles only (1), which is something that many languages
now have the means to deal with themselves. I already stated that (2)
is not covered.
But you may still need makefiles to deal with (3).
If your main requirement /is/ only (1), then my idea is to move the
necessary info into the source code, and tackle it with the C
compiler.
Your proposal and David Brown's needs are not necessarily
contradictory.
All you need to do to satisfy him is to add to your compiler an option
for export of dependencies in make-compatible format, i.e. something
very similar to -MD option of gcc.

Then David could write in his makefile:

out/foo.elf : main_foo.c
	mcc -MD $< -o $@

-include out/foo.d

And then to proceed with automation of his pre- and post-processing needs.
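
For illustration, the generated out/foo.d might then contain something
like this (header names assumed; with gcc one would also pass
-MT out/foo.elf so the rule names the final binary):

out/foo.elf: main_foo.c foo.h util.h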
Scott Lurndal
2024-02-01 20:55:53 UTC
Post by Michael S
On Thu, 1 Feb 2024 18:34:08 +0000
Post by bart
But you may still need makefiles to deal with (3).
If your main requirement /is/ only (1), then my idea is to move the
necessary info into the source code, and tackle it with the C
compiler.
Your proposal and David Brown's needs are not necessarily
contradictory.
Although David (and I) aren't particularly interested in
changing something that already works quite well.
Post by Michael S
All you need to do to satisfy him is to add to your compiler an option
for export of dependencies in make-compatible format, i.e. something
very similar to -MD option of gcc.
I suspect he may be much more difficult to satisfy on this topic.

Nobody is going to switch production software to a one-off
unsupported compiler.
Chris M. Thomasson
2024-02-01 21:10:14 UTC
Post by Scott Lurndal
Post by Michael S
On Thu, 1 Feb 2024 18:34:08 +0000
Post by bart
But you may still need makefiles to deal with (3).
If your main requirement /is/ only (1), then my idea is to move the
necessary info into the source code, and tackle it with the C compiler.
Your proposal and David Brown's needs are not necessarily
contradictory.
Although David (and I) aren't particularly interested in
changing something that already works quite well.
Post by Michael S
All you need to do to satisfy him is to add to your compiler an option
for export of dependencies in make-compatible format, i.e. something
very similar to -MD option of gcc.
I suspect he may be much more difficult to satisfy on this topic.
Nobody is going to switch production software to a one-off
unsupported compiler.
No shit. Even then, he would have to test drive it, make sure it passes
all unit tests, etc... How fun... ;^)
David Brown
2024-02-01 21:38:13 UTC
Post by Michael S
On Thu, 1 Feb 2024 18:34:08 +0000
Post by bart
I've already covered this in many posts on the subject. But 'make' deals with three kinds of requirements:
(1) Specifying what the modules are to be compiled and combined into
one binary file
(2) Specifying dependencies between all files to allow rebuilding of
that one file with minimal recompilation
(3) Everything else needed in a complex project: running processes to
generate files like config.h, creating multiple binaries,
specifying dependencies between binaries, installation etc
My proposal tackles only (1), which is something that many languages
now have the means to deal with themselves. I already stated that (2)
is not covered.
But you may still need makefiles to deal with (3).
If your main requirement /is/ only (1), then my idea is to move the
necessary info into the source code, and tackle it with the C
compiler.
Your proposal and David Brown's needs are not necessarily
contradictory.
All you need to do to satisfy him is to add to your compiler an option
for export of dependencies in make-compatible format, i.e. something
very similar to -MD option of gcc.
out/foo.elf : main_foo.c
-include out/foo.d
And then to proceed with automation of his pre- and post-processing needs.
But then I'd still be using "make", and Bart would not be happy.

And "gcc -MD" does not need any extra #pragmas, so presumably neither
would an implementation of that feature in bcc (or mcc or whatever). So
Bart's new system would disappear entirely.
Michael S
2024-02-01 22:55:38 UTC
On Thu, 1 Feb 2024 22:38:13 +0100
Post by David Brown
Post by Michael S
On Thu, 1 Feb 2024 18:34:08 +0000
Post by bart
I've already covered this in many posts on the subject. But 'make' deals with three kinds of requirements:
(1) Specifying what the modules are to be compiled and combined
into one binary file
(2) Specifying dependencies between all files to allow rebuilding of
that one file with minimal recompilation
(3) Everything else needed in a complex project: running processes
to generate files like config.h, creating multiple binaries,
specifying dependencies between binaries, installation etc
My proposal tackles only (1), which is something that many
languages now have the means to deal with themselves. I already
stated that (2) is not covered.
But you may still need makefiles to deal with (3).
If your main requirement /is/ only (1), then my idea is to move the
necessary info into the source code, and tackle it with the C compiler.
Your proposal and David Brown's needs are not necessarily
contradictory.
All you need to do to satisfy him is to add to your compiler an
option for export of dependencies in make-compatible format, i.e.
something very similar to -MD option of gcc.
out/foo.elf : main_foo.c
-include out/foo.d
And then to proceed with automation of his pre- and post-processing needs.
But then I'd still be using "make", and Bart would not be happy.
And "gcc -MD" does not need any extra #pragmas, so presumably neither
would an implementation of that feature in bcc (or mcc or whatever).
So Bart's new system would disappear entirely.
Bart spares you from managing list(s) of objects in your makefile and
from writing arcane helper macros.
Yes, I know, you copy&paste arcane macros from project to project, but
you had to write them n years ago and that surely was not easy.
Lawrence D'Oliveiro
2024-02-01 23:31:36 UTC
Yes, I know, you copy&paste arcane macros from project to project, but you
had to write them n years ago and that surely was not easy.
And maybe you discover bugs in them in certain situations, and have to
track down all the places you copied/pasted them and fix them.

My code-reuse OCD reflex is twitching at this point.
Scott Lurndal
2024-02-02 02:08:14 UTC
Post by Michael S
On Thu, 1 Feb 2024 22:38:13 +0100
Post by David Brown
Post by Michael S
And then to proceed with automation of his pre- and post-processing needs.
But then I'd still be using "make", and Bart would not be happy.
And "gcc -MD" does not need any extra #pragmas, so presumably neither
would an implementation of that feature in bcc (or mcc or whatever).
So Bart's new system would disappear entirely.
Bart spares you from managing list(s) of objects in your makefile and
from writing arcane helper macros.
Yes, I know, you copy&paste arcane macros from project to project, but
you had to write them n years ago and that surely was not easy.
"Not easy for you" doesn't automatically translate to "not easy for
everyone else".

Difficult is the configuration file for sendmail processed by m4.

Make is easy.
David Brown
2024-02-02 08:02:15 UTC
Post by Michael S
On Thu, 1 Feb 2024 22:38:13 +0100
Post by David Brown
Post by Michael S
On Thu, 1 Feb 2024 18:34:08 +0000
You proposal and needs of David Brown are not necessarily
contradictory.
All you need to do to satisfy him is to add to your compiler an
option for export of dependencies in make-compatible format, i.e.
something very similar to -MD option of gcc.
out/foo.elf : main_foo.c
-include out/foo.d
And then to proceed with automation of his pre- and post-processing needs.
But then I'd still be using "make", and Bart would not be happy.
And "gcc -MD" does not need any extra #pragmas, so presumably neither
would an implementation of that feature in bcc (or mcc or whatever).
So Bart's new system would disappear entirely.
Bart spares you from managing list(s) of objects in your makefile and
from writing arcane helper macros.
Yes, I know, you copy&paste arcane macros from project to project, but
you had to write them n years ago and that surely was not easy.
Google "makefile automatic dependencies", then adapt to suit your own
needs. Re-use the same makefile time and again.

Yes, some of the functions I have in my makefiles are a bit hairy, and
some of the command line options for gcc are a bit complicated. They
are done now.
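
For anyone searching along, one common shape of that pattern - a
minimal sketch with illustrative names, not my actual makefile (recipe
lines must start with a tab):

SRCS := $(wildcard *.c)
OBJS := $(SRCS:.c=.o)
DEPS := $(OBJS:.o=.d)

prog: $(OBJS)
	$(CC) $(OBJS) -o $@

%.o: %.c
	$(CC) $(CFLAGS) -MMD -MP -c $< -o $@

-include $(DEPS)

(-MMD writes a .d file as a side effect of each compilation; -MP adds
phony targets so deleted headers don't break the build.)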

If there had been an easier way than this, which still let me do what I
need (Bart's system does not), which is popular enough that you can
easily google for examples, blogs, and tutorials, then I'd have been
happy to use that at the time. I won't change to something else unless
it gives me significant additional benefits.

People smarter and more experienced than Bart have been trying to invent
better replacements for "make" for many decades. None have succeeded.
Some build systems are better in some ways, but nothing has come close
to covering the wide range of features and uses of make, or gaining hold
outside a particular niche. Everyone who has ever made serious use of
"make" knows it has many flaws, unnecessarily complications, limitations
and inefficiencies. Despite that, it is the best we have.

With Bart's limited knowledge and experience, and deeply ingrained
prejudices and misunderstandings, the best we can hope for is something
that works well enough for some simple cases of C programs. More
realistically, it will work for Bart's use alone.

And that, of course, is absolutely fine. No one is paying Bart to write
a generic build system, or something of use to anyone else. He is free
to write exactly what he wants, in the way he wants, and if he ends up with
a tool that he finds useful himself, that is great. If he ends up with
something that at least some other people find useful, that is even
better, and I wish him luck with his work.

But don't hold your breath waiting for something that will replace make,
or attract users of any other build system.
Michael S
2024-02-02 13:28:49 UTC
Permalink
On Fri, 2 Feb 2024 09:02:15 +0100
Post by David Brown
But don't hold your breath waiting for something that will replace
make, or attract users of any other build system.
It seems you already forgot the context of my post that started this
short sub-thread.

BTW, I would imagine that Stu Feldman, if he is still in good health,
would find talking with Bart more entertaining than talking with you.
I think you English speakers call it birds of a feather.
David Brown
2024-02-02 14:49:20 UTC
Permalink
Post by Michael S
On Fri, 2 Feb 2024 09:02:15 +0100
Post by David Brown
But don't hold your breath waiting for something that will replace
make, or attract users of any other build system.
It seems you already forgot the context of my post that started this
short sub-thread.
That is absolutely possible. It was not intentional, but the number of
posts in recent times has been overwhelming. I apologise if I have
misinterpreted what you wrote.
Post by Michael S
BTW, I would imagine that Stu Feldman, if he is still in good health,
would find talking with Bart more entertaining than talking with you.
I have no idea who that is, so I'll take your word for it.
Post by Michael S
I think you English speakers call it birds of a feather.
Michael S
2024-02-02 14:53:35 UTC
Permalink
On Fri, 2 Feb 2024 15:49:20 +0100
Post by David Brown
Post by Michael S
On Fri, 2 Feb 2024 09:02:15 +0100
Post by David Brown
But don't hold your breath waiting for something that will replace
make, or attract users of any other build system.
It seems you already forgot the context of my post that started
this short sub-thread.
That is absolutely possible. It was not intentional, but the number
of posts in recent times has been overwhelming. I apologise if I
have misinterpreted what you wrote.
Post by Michael S
BTW, I would imagine that Stu Feldman, if he is still in good
health, would find talking with Bart more entertaining than talking
with you.
I have no idea who that is, so I'll take your word for it.
Inventor of make
Post by David Brown
Post by Michael S
I think you English speakers call it birds of a feather.
Kenny McCormack
2024-02-02 16:29:20 UTC
Permalink
In article <***@yahoo.com>,
Michael S <***@yahoo.com> wrote:
...
Post by Michael S
Post by David Brown
I have no idea who that is, so I'll take your word for it.
Inventor of make
At Bell Labs, in 1976.

Currently, like quite a few ex-Bell Labs people, a big wheel at Google.
--
Marshall: 10/22/51
Jessica: 4/4/79
Kaz Kylheku
2024-02-02 17:29:24 UTC
Permalink
Post by Kenny McCormack
...
Post by Michael S
Post by David Brown
I have no idea who that is, so I'll take your word for it.
Inventor of make
At Bell Labs, in 1976.
Currently, like quite a few ex-Bell Labs people, a big wheel at Google.
It's like a cargo cult. If we hire old Unix geezers and prop them up in
chairs, be they dead or alive, magic will happen.
--
TXR Programming Language: http://nongnu.org/txr
Cygnal: Cygwin Native Application Library: http://kylheku.com/cygnal
Mastodon: @***@mstdn.ca
bart
2024-02-02 13:47:25 UTC
Permalink
Post by David Brown
Post by Michael S
On Thu, 1 Feb 2024 22:38:13 +0100
Post by David Brown
Post by Michael S
On Thu, 1 Feb 2024 18:34:08 +0000
Your proposal and the needs of David Brown are not necessarily
contradictory.
All you need to do to satisfy him is to add to your compiler an
option for export of dependencies in make-compatible format, i.e.
something very similar to -MD option of gcc.
out/foo.elf : main_foo.c
-include out/foo.d
And then to proceed with automation of his pre- and post-processing needs.
But then I'd still be using "make", and Bart would not be happy.
And "gcc -MD" does not need any extra #pragmas, so presumably neither
would an implementation of that feature in bcc (or mcc or whatever).
So Bart's new system would disappear entirely.
Bart spares you from managing list(s) of objects in your makefile and
from writing arcane helper macros.
Yes, I know, you copy&paste arcane macros from project to project, but
you had to write them n years ago and that surely was not easy.
Google "makefile automatic dependencies", then adapt to suit your own
needs.  Re-use the same makefile time and again.
Yes, some of the functions I have in my makefiles are a bit hairy, and
some of the command line options for gcc are a bit complicated.  They
are done now.
If there had been an easier way than this, which still let me do what I
need (Bart's system does not), which is popular enough that you can
easily google for examples, blogs, and tutorials, then I'd have been
happy to use that at the time.  I won't change to something else unless
it gives me significant additional benefits.
People smarter and more experienced than Bart have been trying to invent
better replacements for "make" for many decades.  None have succeeded.
Some build systems are better in some ways, but nothing has come close
to covering the wide range of features and uses of make, or gaining hold
outside a particular niche.  Everyone who has ever made serious use of
"make" knows it has many flaws, unnecessarily complications, limitations
and inefficiencies.  Despite that, it is the best we have.
With Bart's limited knowledge and experience,
That's true: only 47 years in computing, and 42 years of evolving,
implementing and running my systems language.

What can I possibly know about compiling source files of a lower-level
language into binaries?

How many assemblers, compilers, linkers, and interpreters have /you/
written?
Post by David Brown
and deeply ingrained
prejudices and misunderstandings, the best we can hope for is something
that works well enough for some simple cases of C programs.
With the proposal outlined in my OP, any of MY C programs, if I was to
write or port multi-module projects in that language, could be trivially
built by giving only the name of the compiler, and the name of one module.
Post by David Brown
  More
realistically, it will work for Bart's use alone.
It certainly won't for your stuff, or SL's, or JP's, or TR's, as you
all seem to delight in wheeling out the most complex scenarios you can find.

That is another aspect you might do well to learn how to do: KISS. (Yes
I can be a patronising fuck too.)
Post by David Brown
And that, of course, is absolutely fine.  No one is paying Bart to write
a generic build system, or something of use to anyone else.  He is free
to write exactly what he wants, in the way he wants, and if he ends up with
a tool that he finds useful himself, that is great.  If he ends up with
something that at least some other people find useful, that is even
better, and I wish him luck with his work.
But don't hold your breath waiting for something that will replace make,
or attract users of any other build system.
Jesus. And you seem determined to ignore everything I write, or have
a short memory.

I'm not suggesting replacing make, only to reduce its involvement.

Twice I posted a list of 3 things that make takes care of; I'm looking
at replacing just 1 of those things, the one which for me is most critical.
David Brown
2024-02-02 14:57:28 UTC
Permalink
Post by bart
Post by David Brown
Post by Michael S
On Thu, 1 Feb 2024 22:38:13 +0100
Post by David Brown
Post by Michael S
On Thu, 1 Feb 2024 18:34:08 +0000
Your proposal and the needs of David Brown are not necessarily
contradictory.
All you need to do to satisfy him is to add to your compiler an
option for export of dependencies in make-compatible format, i.e.
something very similar to -MD option of gcc.
out/foo.elf : main_foo.c
-include out/foo.d
And then to proceed with automation of his pre- and post-processing needs.
But then I'd still be using "make", and Bart would not be happy.
And "gcc -MD" does not need any extra #pragmas, so presumably neither
would an implementation of that feature in bcc (or mcc or whatever).
So Bart's new system would disappear entirely.
Bart spares you from managing list(s) of objects in your makefile and
from writing arcane helper macros.
Yes, I know, you copy&paste arcane macros from project to project, but
you had to write them n years ago and that surely was not easy.
Google "makefile automatic dependencies", then adapt to suit your own
needs.  Re-use the same makefile time and again.
Yes, some of the functions I have in my makefiles are a bit hairy, and
some of the command line options for gcc are a bit complicated.  They
are done now.
If there had been an easier way than this, which still let me do what
I need (Bart's system does not), which is popular enough that you can
easily google for examples, blogs, and tutorials, then I'd have been
happy to use that at the time.  I won't change to something else
unless it gives me significant additional benefits.
People smarter and more experienced than Bart have been trying to
invent better replacements for "make" for many decades.  None have
succeeded. Some build systems are better in some ways, but nothing has
come close to covering the wide range of features and uses of make, or
gaining hold outside a particular niche.  Everyone who has ever made
serious use of "make" knows it has many flaws, unnecessarily
complications, limitations and inefficiencies.  Despite that, it is
the best we have.
With Bart's limited knowledge and experience,
That's true: only 47 years in computing, and 42 years of evolving,
implementing and running my systems language.
Yes. Most of it using your languages, your tools, your programs, and
determinedly refusing to learn or use anything else more than the barest
minimum, and so completely convinced of your own superiority and the
failings of everyone else and all other languages and software that you
are unable to learn things properly or consider anything from a
viewpoint other than your own.

You have experience - but it is limited by the walls you put up around
yourself.
Post by bart
What can I possibly know about compiling source files of a lower-level
language into binaries?
You know how /you/ do it, and how /you/ want to do it. You know sod all
about anyone else.
Post by bart
That is another aspect you might do well to learn how to do: KISS. (Yes
I can be a patronising fuck too.)
KISS is great. It's what encourages people to use existing standard
tools like "make" and "C", instead of trying to re-invent their own
personal wheels all the time. /Sometimes/ it is useful to re-invent
something from scratch. Most of the time, it is not.
Post by bart
Post by David Brown
And that, of course, is absolutely fine.  No one is paying Bart to
write a generic build system, or something of use to anyone else.  He
is free to write exactly what he wants, in the way he wants, and if he
ends up with a tool that he finds useful himself, that is great.  If
he ends up with something that at least some other people find useful,
that is even better, and I wish him luck with his work.
But don't hold your breath waiting for something that will replace
make, or attract users of any other build system.
Jesus. And you seem determined to ignore everything I write, or have
a short memory.
I'm not suggesting replacing make, only to reduce its involvement.
I didn't say you were trying to replace make, or even thought you were.
I said you were not replacing make. There's a difference.
Post by bart
Twice I posted a list of 3 things that make takes care of; I'm looking
at replacing just 1 of those things, the one which for me is most critical.
And I have repeatedly said that if you are making a tool that is useful
for you, then great - make your tool and use it.
Scott Lurndal
2024-02-02 15:18:11 UTC
Permalink
Post by bart
Post by David Brown
With Bart's limited knowledge and experience,
That's true: only 47 years in computing, and 42 years of evolving,
implementing and running my systems language.
It's pretty clear that you have very limited knowledge
and experience with unix, make, and pretty much
anything that isn't your soi-disant compiler.
Post by bart
What can I possibly know about compiling source files of a lower-level
language into binaries?
Very little, it appears, outside of your toy projects.
Post by bart
How many assemblers, compilers, linkers, and interpreters have /you/
written?
Can't speak for David, but in my case, at least one of each, and
you can add operating systems and hypervisors to that list.
Post by bart
It certainly won't for your stuff, or SL's, or JP's, or TR's, as you
all seem to delight in wheeling out the most complex scenarios you can find.
The "stuff" I write is for customers. Any so-called-bart-complexity is based on
customer requirements. The customers are quite happy with the solutions
they get.
Post by bart
That is another aspect you might do well to learn how to do: KISS. (Yes
I can be a patronising fuck too.)
KISS is a good principle to follow, and while I cannot again speak
for David, it's a principle followed by most programmers I've worked
with. That doesn't mean throwing away perfectly usable tools
(one can easily make KISS-compliant makefiles, for example).
Post by bart
I'm not suggesting replacing make, only to reduce its involvement.
And to reduce its involvement, something must replace make. Ipso facto.
bart
2024-02-02 17:44:26 UTC
Permalink
Post by Scott Lurndal
Post by bart
Post by David Brown
With Bart's limited knowledge and experience,
That's true: only 47 years in computing, and 42 years of evolving,
implementing and running my systems language.
It's pretty clear that you have very limited knowledge
and experience with unix, make, and pretty much
anything that isn't your soi-disant compiler.
Yes. And?
Post by Scott Lurndal
Post by bart
What can I possibly know about compiling source files of a lower-level
language into binaries?
Very little, it appears, outside of your toy projects.
That's right, I only have experience of the stuff I've done. And?

Most stuff I want to build is on a similar scale, so you'd probably
consider all that as toys too.

You're saying that anyone not using Unix, not building 10Mloc projects,
and not a fan of make, should FOAD?
Post by Scott Lurndal
Post by bart
How many assemblers, compilers, linkers, and interpreters have /you/
written?
OK. How do I know these aren't just toys, or is it only you who is
allowed to judge?

BTW what exactly is a toy project?
Post by Scott Lurndal
Can't speak for David, but in my case, at least one of each, and
you can add operating systems and hypervisors to that list.
I don't do OSes. If I did, you probably have a good idea of what mine
would look like!
Post by Scott Lurndal
Post by bart
It certainly won't for your stuff, or SL's, or JP's, or TR's, as you
all seem to delight in wheeling out the most complex scenarios you can find.
The "stuff" I write is for customers. Any so-called-bart-complexity is based on
customer requirements. The customers are quite happy with the solutions
they get.
Post by bart
That is another aspect you might do well to learn how to do: KISS. (Yes
I can be a patronising fuck too.)
KISS is a good principle to follow, and while I cannot again speak
for David, it's a principle followed by most programmers I've worked
with. That doesn't mean throwing away perfectly usable tools
(one can easily make KISS-compliant makefiles, for example).
Post by bart
I'm not suggesting replacing make, only to reduce its involvement.
And to reduce its involvement, something must replace make. Ipso facto.
No. I'm saying make should be less involved in specifying which files
are to be submitted to a compiler-toolchain.

Especially for a makefile specifying a production or distribution build,
such as one done at a remote site by someone who is not the developer,
but who just wants a working binary.
Scott Lurndal
2024-02-02 18:26:53 UTC
Permalink
Post by bart
You're saying that anyone not using Unix, not building 10Mloc projects,
and not a fan of make, should FOAD?
No. I'm saying that your dislike of make is personal. If you
don't like it, don't use it. Make your own, nobody is stopping
you. Just don't brag about how "small", "easy" or "nice" it is
until it can handle the same job as make.
David Brown
2024-02-01 21:34:36 UTC
Permalink
Post by bart
Post by bart
Post by David Brown
1. It lets you split the program into separate parts - generally
separate files.  This is essential for scalability for large programs.
2. You can compile modules independently to allow partial builds.
3. Modules generally have some way to specify exported symbols and
facilities that can be used by other modules.
4. Modules can "import" other modules, gaining access to those
modules' exported symbols.
5. Modules provide encapsulation of data, code and namespaces.
6. Modules can be used in a hierarchical system, building big
modules from smaller ones to support larger libraries with many files.
7. Modules provide a higher level concept that can be used by
language tools to see how the whole program fits together or
interact with package managers and librarian tools.
C provides 1, 2, 3, and 4 if you use a "file.c/file.h" organisation.
It provides a limited form of 5 (everything that is not exported is
"static"), but scaling to larger systems is dependent on identifier
prefixes.
You seem to be thinking purely about item 7 above.  This is, I
think, common in interpreted languages (where modules have to be
found at run-time, where the user is there but the developer is not).
I've been implementing languages with language-supported modules for
about 12 years.
They generally provide 1, 2, 4, 5, and 7 from your list, and partial
support of 6.
Sure.  Programming languages need that if they are to scale at all.
Post by bart
They don't provide 2 (compiling individual modules) because the aim
is a very fast, whole-program compiler.
Okay.
But what you are talking about to add to C is item 7, nothing more.
That is not adding "modules" to C.  Your suggestion might be useful to
some people for some projects, but that doesn't make it "modules" in
any real sense.
Item 7 is my biggest stumbling block in building open source C projects.
While the developer (say you) knows the necessary info and can somehow
import it into the build system, my job is trying to get it out.
I can't use the intended build system because for one reason or another
it doesn't work, or requires complex dependencies (MSYS, CMake, MSTOOLS,
.configure), or I want to run mcc on it.
That info could trivially be added to the C source code. Nobody actually
needs to use my #pragma scheme; it could simply be a block comment on
one of the modules.
I'm sure that, with all your complicated tools, they could dump some
   // cipher.c
   // hmac.c
   // sha2.c
and prepend it to one of the files.  Even a README will do.
That wouldn't hurt, would it?
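
(To make that concrete: if those three comment lines sat at the top of
cipher.c, then even with no compiler support at all, a build could be a
one-liner - a sketch only, assuming the manifest is exactly the first
three "// " lines of the file:

$ gcc $(sed -n 's|^// ||p' cipher.c | head -n 3) -o cipher

The point is just that the file list lives with the source instead of
in a separate build file.)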
Complain to the people that made that open source software, not me. But
don't be surprised if they tell you "There's a makefile. It works for
everyone else." Or maybe they will say they can't cater for every
little problem with everyone's unusual computer setup. Maybe they will
try to be helpful, maybe they will be rude and arrogant. Maybe they
will point out that their makefile /is/ just a list of the files needed,
along with the compiler options. Usually projects of any size /do/ have
readme's and build instructions - but some won't.

No matter what, it is not the fault of anyone here, it is not the fault
of "make" or Linux or C, and there is nothing that any of us can do to
help you. (And $DEITY knows, we have tried.)
Post by bart
I already have tools for determining dependencies.  What can your
methods do that mine can't?
(And don't bother saying that you can do it without extra tools -
everyone who wants "make" and "gcc" has them on hand.  And those who
want an IDE that figures out dependencies for them have a dozen free
options there too.  These are all standard tools available to everyone.)
So, if C were to acquire modules, so that a C compiler could determine
all that for itself (maybe even work out for itself which files need
recompiling), would you just ignore that feature and use the same
auxiliary methods you have always done?
That's not unlikely. Why would I change? You still haven't given any
reasons why your tools would be /better/. Even if they could do all I
needed to do for a particular project, "just as good" is not "better",
and does not encourage change.

I would still need "make" for everything else. I would, however, be
quite happy if there were some standard way to get the list of include
files needed by a C file, rather than using gcc-specific flags.
Post by bart
You don't see that the language taking over task (1) of the things that
makefiles do, and possibly (2) (of the list I posted; repeated below),
can streamline makefiles to make them shorter, simpler, easier to write
and to read, and with fewer opportunities to get stuff wrong?
That was a rhetorical question. Obviously not.
I've nothing against shorter or simpler makefiles. But as far as I can
see, you are just moving the same information from a makefile into the C
files.

Indeed, you are duplicating things - now your C files have to have
"#pragma module this, #pragma module that" in addition to having
"#include this.h, #include that.h". With my makefiles, all the "this"
and "that" is found automatically - writing the includes in the C code
is sufficient.
Post by bart
Perhaps I would find your tools worked for a "Hello, world" project.
Maybe they were still okay as it got slightly bigger.  Then I'd have
something that they could not handle, and I'd reach for make.  What
would be the point of using "make" to automate - for example -
post-processing of a binary to add a CRC check, but using your tools
to handle the build?  It's much easier just to use "make" for the
whole thing.
Because building one binary is a process that should be the job of a
compiler, not some random external tool that knows nothing of the
language or compiler.
No, it is the job of the linker. Compiling is the job of the compiler.
Controlling the build is the job of the build system. I don't see
monolithic applications as an advantage.
Post by bart
Maybe you think makefiles should individually list all the 1000s of
functions of a project too?
You are offering me a fish.  I am offering to teach you to fish,
including where to go to catch different kinds of fish.  This is
really a no-brainer choice.
That analogy makes no sense.
Let me try and explain what I do: I write whole-program compilers. That
means that, each time you do a new build, it will reprocess each file
from source. They use the language's module scheme to know which files
to process.
Surely most sensibly organised projects could then be built with :

bcc *.c -o prog.exe

I mean, that's what I can do with gcc if I had something that doesn't
need other flags (which is utterly impractical for my work).

Or if I had lots of files, each with their own c file :

for f in *.c; do gcc $f -o ${f%.c}; done
Post by bart
It works for me, and I'm sure could work for others if they didn't have
makefiles forced down their throats and hardwired into their brains.
/Nobody/ has makefiles forced on them. People use "make" because it is
convenient, and it works. If something better comes along, and it is
better enough to overcome the familiarity momentum, people will use that.

I do a round of checking the state of the art of build tools on a
regular basis - perhaps every year or so. I look at what's popular and
what's new, to see if there's anything that would work for me and be a
step up from what I have. So far, I've not found anything that comes
very close to "make" for my needs. There's some tools that are pretty
good in many ways, but none that I can see as being a better choice for
me than "make". I am, however, considering CMake (which works at a
higher level, and outputs makefiles, ninja files or other project
files). It appears to have some disadvantages compared to my makefiles,
such as needing to be run as an extra step when files are added to or
removed from a project or dependencies are changed, but that doesn't
happen too often, and its integration with other tools and projects
might make it an overall win. I'll need some time to investigate and
study it.

So I will happily move from "make" when I find something better - enough
better to make it worth the effort. I'll happily move from gcc, or
Linux, if I find something enough better to make it worth changing. I
regularly look at alternatives and consider them - clang is the key
challenger to gcc for my purposes.

But I have no interest in changing to something vastly more limited and
which adds nothing at all.
bart
2024-02-01 22:29:13 UTC
Permalink
Post by bart
You don't see that the language taking over task (1) of the things
that makefiles do, and possibly (2) (of the list I posted; repeated
below), can streamline makefiles to make them shorter, simpler, easier
to write and to read, and with fewer opportunities to get stuff wrong?
That was a rhetorical question. Obviously not.
I've nothing against shorter or simpler makefiles.  But as far as I can
see, you are just moving the same information from a makefile into the C
files.
Indeed, you are duplicating things - now your C files have to have
"#pragma module this, #pragma module that" in addition to having
"#include this.h, #include that.h".  With my makefiles, all the "this"
and "that" is found automatically - writing the includes in the C code
is sufficient.
I don't think so. Seeing:

#include "file.h"

doesn't necessarily mean there is a matching "file.c". It might not
exist, or the header might be for some external library, or maybe it
does exist but in a different location.

Or maybe some code may use a file "fred.c", which needs to be submitted
to the compiler, but for which there is either no header used, or which
uses a header file with a different name.

As I said, C's uses of .h and .c files are chaotic.

Did you have in mind using gcc's -MM option? For my 'cipher.c' demo,
that only gives a set of header names. Missing are hmac.c and sha2.c.

If I try it on lua.c, it gives me only 5 header files; the project
comprises 33 .c files and 27 .h files.
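
(For concreteness, a hypothetical run - assuming cipher.c includes
hmac.h and sha2.h:

$ gcc -MM cipher.c
cipher.o: cipher.c hmac.h sha2.h

Nothing in that output names hmac.c or sha2.c; mapping each header back
to the translation unit that implements it is exactly the step that -MM
leaves to the makefile author.)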
Post by bart
Post by David Brown
Perhaps I would find your tools worked for a "Hello, world" project.
Maybe they were still okay as it got slightly bigger.  Then I'd have
something that they could not handle, and I'd reach for make.  What
would be the point of using "make" to automate - for example -
post-processing of a binary to add a CRC check, but using your tools
to handle the build?  It's much easier just to use "make" for the
whole thing.
Because building one binary is a process that should be the job of a
compiler, not some random external tool that knows nothing of the
language or compiler.
No, it is the job of the linker.
That is where you're still stuck in the past.

I first got rid of a formal 'linker' about 40 years ago. I got rid of
the notion of combining independently compiled modules into an
executable a decade ago.

Linking would only come up for me if I wanted to statically combine the
outputs of several languages. Since I can't process object files, I'd
need to generate an object file (in my case, one representing ALL my
modules) and use a traditional linker. That would be someone else's job.
  Compiling is the job of the compiler.
Controlling the build is the job of the build system.  I don't see
monolithic applications as an advantage.
I do. You type:

cc prog

without knowing or caring whether it contains that one module, or there
are 99 more.

In any case, your linker will generate a monolithic binary whether you
like it or not.

But I suspect you don't understand what a 'whole-program compiler' does:

* It means that for each binary, all sources are recompiled at the same
time to create it

* It doesn't mean that an application can only comprise one binary

* It moves the compilation unit granularity from a module to a single
EXE or DLL file

* Interfaces (in the case of a lower level language), are moved inter-
module to inter-program. The boundaries are between one program or
library and another, not between modules.

A language which claims to have a module system, but still compiles a
module at a time, will probably still have discrete inter-module
interfaces, although they may be handled automatically.
Post by bart
Maybe you think makefiles should individually list all the 1000s of
functions of a project too?
Post by David Brown
You are offering me a fish.  I am offering to teach you to fish,
including where to go to catch different kinds of fish.  This is
really a no-brainer choice.
That analogy makes no sense.
Let me try and explain what I do: I write whole-program compilers.
That means that, each time you do a new build, it will reprocess each
file from source. They use the language's module scheme to know which
files to process.
    bcc *.c -o prog.exe
I mean, that's what I can do with gcc if I had something that doesn't
need other flags (which is utterly impractical for my work).
Yes, that's one technique that can be used. But few projects are like
that one. For one or two, you can try *.c and it will work.

Malcolm's resource compiler is like that, but it still benefits from a
file like this:

#pragma module "*.c"
#pragma module "freetype/*.c"
#pragma module "samplerate/*.c"

here called bbx.c. I can build it like this:

c:\bbx\src>mcc bbx
Compiling bbx.c to bbx.exe
/Nobody/ has makefiles forced on them.  People use "make" because it is
convenient, and it works.
BUT IT DOESN'T. It fails a lot of the time on Windows, but the makefiles
are too complicated to figure out why. From a recent thread I made about trying
to build piet.c, it failed on extra programs that weren't needed (that
was on Linux; it didn't work at all on Windows).

This is a program which actually only needed:

cc piet.c

(Here cc *.c wouldn't work.) This mirrors pretty much what I see in most
C projects; needless complexity that muddies the waters and creates
failures.

ALL I WANT IS A LIST OF FILES. Why doesn't anybody get that? And why is
it so hard?

Apparently makefiles are superior because you don't even need to know
the name of the program (and will have to hunt for where it put the
executable because it won't tell you!).
But I have no interest in changing to something vastly more limited and
which adds nothing at all.
That's right; it adds nothing, but it takes a lot away! Like a lot of
failure points.

(Look at the Monty Hall problem, but instead of 3 doors, try it with
100, of which 98 will be opened. Then it will easy to make the right
decision because nearly all the wrong ones have been eliminated.)
Keith Thompson
2024-02-01 23:28:03 UTC
Permalink
bart <***@freeuk.com> writes:
[...]
Post by bart
As I said, C's uses of .h and .c files are chaotic.
C doesn't use .h and .c files. The C standard doesn't specify file
extensions, either for source files or for files included with #include.

It's fairly straightforward to implement something similar to "modules"
in C, using matching *.h and *.c files, include guards, and so forth,
but it requires a bit of discipline. It's a mechanism built on top of
the language, not a feature of the language itself (though of course the
language definition intentionally supports that usage).
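
(A minimal sketch of that discipline, using a hypothetical "counter"
module:

/* counter.h */
#ifndef COUNTER_H
#define COUNTER_H
void counter_inc(void);
int counter_get(void);
#endif

/* counter.c */
#include "counter.h"
static int count; /* encapsulated: invisible to other translation units */
void counter_inc(void) { ++count; }
int counter_get(void) { return count; }

Clients #include "counter.h" and never see the static state.)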

Some projects might use .h and .c files in a chaotic manner. Most, in
my experience, do not.
--
Keith Thompson (The_Other_Keith) Keith.S.Thompson+***@gmail.com
Working, but not speaking, for Medtronic
void Void(void) { Void(); } /* The recursive call of the void */
Lawrence D'Oliveiro
2024-02-02 01:03:09 UTC
Permalink
Post by Keith Thompson
The C standard doesn't specify file
extensions, either for source files or for files included with #include.
It does for the standard library includes, though.
Keith Thompson
2024-02-02 01:42:32 UTC
Permalink
Post by Lawrence D'Oliveiro
Post by Keith Thompson
The C standard doesn't specify file
extensions, either for source files or for files included with #include.
It does for the standard library includes, though.
Strictly speaking, it doesn't specify that the standard library headers
are files. But yes, their names end in ".h", and that's certainly
because of the common convention to use ".h" as the extension for C
header files.
--
Keith Thompson (The_Other_Keith) Keith.S.Thompson+***@gmail.com
Working, but not speaking, for Medtronic
void Void(void) { Void(); } /* The recursive call of the void */
Lawrence D'Oliveiro
2024-02-02 02:43:51 UTC
Permalink
Post by Keith Thompson
Post by Lawrence D'Oliveiro
The C standard doesn't specify file extensions, either for source
files or for files included with #include.
It does for the standard library includes, though.
Strictly speaking, it doesn't specify that the standard library headers
are files.
From the C99 spec, page 149:

6.10.2 Source file inclusion
Constraints
A #include directive shall identify a header or source file that
can be processed by the implementation.

...

3 A preprocessing directive of the form
# include "q-char-sequence" new-line
causes the replacement of that directive by the entire contents of
the source file identified by the specified sequence between the "
delimiters. The named source file is searched for in an
implementation-defined manner.

So you see, the spec very explicitly uses the term “file”.

<https://www.open-std.org/JTC1/SC22/WG14/www/docs/n869/>
Keith Thompson
2024-02-02 03:03:38 UTC
Permalink
Post by Lawrence D'Oliveiro
Post by Keith Thompson
Post by Lawrence D'Oliveiro
The C standard doesn't specify file extensions, either for source
files or for files included with #include.
It does for the standard library includes, though.
Strictly speaking, it doesn't specify that the standard library headers
are files.
6.10.2 Source file inclusion
Constraints
A #include directive shall identify a header or source file that
can be processed by the implementation.
...
3 A preprocessing directive of the form
# include "q-char-sequence" new-line
causes the replacement of that directive by the entire contents of
the source file identified by the specified sequence between the "
delimiters. The named source file is searched for in an
implementation-defined manner.
So you see, the spec very explicitly uses the term “file”.
<https://www.open-std.org/JTC1/SC22/WG14/www/docs/n869/>
Yes, but not in reference to the standard headers.

A #include directive with <> searches for a "header", which is not
stated to be a file. A #include directive with "" searches for a file
in an implementation-defined manner; if that search fails, it tries
again as if <> had been used.

References to standard headers (stdio.h et al) always use the <> syntax.
You can write `#include "stdio.h"` if you like, but it risks picking up
a file with the same name instead of the standard header (which *might*
be what you want).

BTW, the n1256.pdf draft is a close approximation to the C99 standard;
it consists of the published standard with the three Technical
Corrigenda merged into it. The n1570.pdf draft is the last publicly
release draft before C11 was published, and is close enough to C11 for
most purposes.
--
Keith Thompson (The_Other_Keith) Keith.S.Thompson+***@gmail.com
Working, but not speaking, for Medtronic
void Void(void) { Void(); } /* The recursive call of the void */
David Brown
2024-02-02 09:54:21 UTC
Permalink
Post by Keith Thompson
Post by Lawrence D'Oliveiro
Post by Keith Thompson
Post by Lawrence D'Oliveiro
The C standard doesn't specify file extensions, either for source
files or for files included with #include.
It does for the standard library includes, though.
Strictly speaking, it doesn't specify that the standard library headers
are files.
6.10.2 Source file inclusion
Constraints
A #include directive shall identify a header or source file that
can be processed by the implementation.
...
3 A preprocessing directive of the form
# include "q-char-sequence" new-line
causes the replacement of that directive by the entire contents of
the source file identified by the specified sequence between the "
delimiters. The named source file is searched for in an
implementation-defined manner.
So you see, the spec very explicitly uses the term “file”.
<https://www.open-std.org/JTC1/SC22/WG14/www/docs/n869/>
Yes, but not in reference to the standard headers.
A #include directive with <> searches for a "header", which is not
stated to be a file. A #include directive with "" searches for a file
in an implementation-defined manner; if that search fails, it tries
again as if <> had been used.
References to standard headers (stdio.h et al) always use the <> syntax.
You can write `#include "stdio.h"` if you like, but it risks picking up
a file with the same name instead of the standard header (which *might*
be what you want).
BTW, the n1256.pdf draft is a close approximation to the C99 standard;
it consists of the published standard with the three Technical
Corrigenda merged into it. The n1570.pdf draft is the last publicly
release draft before C11 was published, and is close enough to C11 for
most purposes.
In 7.1.2 "Standard headers", it says:

"""
Each library function is declared, with a type that includes a
prototype, in a header, 188) whose contents are made available by the
#include preprocessing directive.
"""

"Header" here is in italics, meaning it is a definition of the term.
And footnote 188 has :

"""
header is not necessarily a source file, nor are the < and > delimited
sequences in header names necessarily valid source file names.
"""

(I am quoting from n2346, the final C18 draft. The section numbering is
generally consistent between standard versions, but footnote numbers
change, in case anyone is looking this up.)


I have personally used a toolchain where the standard library headers
did not exist as files, but were internal to the compiler (and the
implementations were internal to the linker). I think the toolchain
company was a bit paranoid that others would copy their proprietary library.
tTh
2024-02-02 02:22:46 UTC
Permalink
   cc prog
without knowing or caring whether it contains that one module, or there
are 99 more.
I also do. You type:

make prog

without knowing or caring whether it contains that one module, or
there are 51 more.
--
+---------------------------------------------------------------------+
| https://tube.interhacker.space/a/tth/video-channels |
+---------------------------------------------------------------------+
bart
2024-02-02 11:13:13 UTC
Permalink
    cc prog
without knowing or caring whether it contains that one module, or
there are 99 more.
   make prog
without knowing or caring whether it contains that one module, or
there are 51 more.
Really? OK, let's try it:

c:\c>make cipher
cc cipher.c -o cipher
C:\tdm\bin\ld.exe: C:\Users\44775\AppData\Local\Temp\ccRvFIdY.o:cipher.c:(.text+0x55a): undefined reference to `hmac_sha256_final'

It seems I do need to care after all!

Oh, you mean I don't need to care AFTER I've created a complicated
makefile containing all those details that you claim I don't need to
bother with?

Let's try with a real solution:

c:\c>mcc cipher
Compiling cipher.c to cipher.exe


Or here's one where I don't need to add anything to the C code:

c:\c>bcc -auto cipher
1 Compiling cipher.c to cipher.asm (Pass 1)
* 2 Compiling hmac.c to hmac.asm (Pass 2)
* 3 Compiling sha2.c to sha2.asm (Pass 2)
Assembling to cipher.exe

I'm the one who's trying innovative approaches to minimise the extra
gumph you need to provide to build programs.

You're the one who needs to first write a pile of garbage within a
makefile in order for you to do:

make prog

Below is the makefile needed to build lua 5.4, which is a project of
only 35 C modules. Simple, isn't it?

---------------------------------
# Makefile for building Lua
# See ../doc/readme.html for installation and customization instructions.

# == CHANGE THE SETTINGS BELOW TO SUIT YOUR ENVIRONMENT =======================

# Your platform. See PLATS for possible values.
PLAT= guess

CC= gcc -std=gnu99
CFLAGS= -O2 -Wall -Wextra -DLUA_COMPAT_5_3 $(SYSCFLAGS) $(MYCFLAGS)
LDFLAGS= $(SYSLDFLAGS) $(MYLDFLAGS)
LIBS= -lm $(SYSLIBS) $(MYLIBS)

AR= ar rcu
RANLIB= ranlib
RM= rm -f
UNAME= uname

SYSCFLAGS=
SYSLDFLAGS=
SYSLIBS=

MYCFLAGS=
MYLDFLAGS=
MYLIBS=
MYOBJS=

# Special flags for compiler modules; -Os reduces code size.
CMCFLAGS=

# == END OF USER SETTINGS -- NO NEED TO CHANGE ANYTHING BELOW THIS LINE =======

PLATS= guess aix bsd c89 freebsd generic ios linux linux-readline macosx mingw posix solaris

LUA_A= liblua.a
CORE_O= lapi.o lcode.o lctype.o ldebug.o ldo.o ldump.o lfunc.o lgc.o llex.o lmem.o lobject.o lopcodes.o lparser.o lstate.o lstring.o ltable.o ltm.o lundump.o lvm.o lzio.o
LIB_O= lauxlib.o lbaselib.o lcorolib.o ldblib.o liolib.o lmathlib.o loadlib.o loslib.o lstrlib.o ltablib.o lutf8lib.o linit.o
BASE_O= $(CORE_O) $(LIB_O) $(MYOBJS)

LUA_T= lua
LUA_O= lua.o

LUAC_T= luac
LUAC_O= luac.o

ALL_O= $(BASE_O) $(LUA_O) $(LUAC_O)
ALL_T= $(LUA_A) $(LUA_T) $(LUAC_T)
ALL_A= $(LUA_A)

# Targets start here.
default: $(PLAT)

all: $(ALL_T)

o: $(ALL_O)

a: $(ALL_A)

$(LUA_A): $(BASE_O)
$(AR) $@ $(BASE_O)
$(RANLIB) $@

$(LUA_T): $(LUA_O) $(LUA_A)
$(CC) -o $@ $(LDFLAGS) $(LUA_O) $(LUA_A) $(LIBS)

$(LUAC_T): $(LUAC_O) $(LUA_A)
$(CC) -o $@ $(LDFLAGS) $(LUAC_O) $(LUA_A) $(LIBS)

test:
./$(LUA_T) -v

clean:
$(RM) $(ALL_T) $(ALL_O)

depend:
@$(CC) $(CFLAGS) -MM l*.c

echo:
@echo "PLAT= $(PLAT)"
@echo "CC= $(CC)"
@echo "CFLAGS= $(CFLAGS)"
@echo "LDFLAGS= $(LDFLAGS)"
@echo "LIBS= $(LIBS)"
@echo "AR= $(AR)"
@echo "RANLIB= $(RANLIB)"
@echo "RM= $(RM)"
@echo "UNAME= $(UNAME)"

# Convenience targets for popular platforms.
ALL= all

help:
@echo "Do 'make PLATFORM' where PLATFORM is one of these:"
@echo " $(PLATS)"
@echo "See doc/readme.html for complete instructions."

guess:
@echo Guessing `$(UNAME)`
@$(MAKE) `$(UNAME)`

AIX aix:
$(MAKE) $(ALL) CC="xlc" CFLAGS="-O2 -DLUA_USE_POSIX -DLUA_USE_DLOPEN"
SYSLIBS="-ldl" SYSLDFLAGS="-brtl -bexpall"

bsd:
$(MAKE) $(ALL) SYSCFLAGS="-DLUA_USE_POSIX -DLUA_USE_DLOPEN"
SYSLIBS="-Wl,-E"

c89:
$(MAKE) $(ALL) SYSCFLAGS="-DLUA_USE_C89" CC="gcc -std=c89"
@echo ''
@echo '*** C89 does not guarantee 64-bit integers for Lua.'
@echo '*** Make sure to compile all external Lua libraries'
@echo '*** with LUA_USE_C89 to ensure consistency'
@echo ''

FreeBSD NetBSD OpenBSD freebsd:
$(MAKE) $(ALL) SYSCFLAGS="-DLUA_USE_LINUX -DLUA_USE_READLINE
-I/usr/include/edit" SYSLIBS="-Wl,-E -ledit" CC="cc"

generic: $(ALL)

ios:
$(MAKE) $(ALL) SYSCFLAGS="-DLUA_USE_IOS"

Linux linux: linux-noreadline

linux-noreadline:
$(MAKE) $(ALL) SYSCFLAGS="-DLUA_USE_LINUX" SYSLIBS="-Wl,-E -ldl"

linux-readline:
$(MAKE) $(ALL) SYSCFLAGS="-DLUA_USE_LINUX -DLUA_USE_READLINE"
SYSLIBS="-Wl,-E -ldl -lreadline"

Darwin macos macosx:
$(MAKE) $(ALL) SYSCFLAGS="-DLUA_USE_MACOSX -DLUA_USE_READLINE"
SYSLIBS="-lreadline"

mingw:
$(MAKE) "LUA_A=lua54.dll" "LUA_T=lua.exe" \
"AR=$(CC) -shared -o" "RANLIB=strip --strip-unneeded" \
"SYSCFLAGS=-DLUA_BUILD_AS_DLL" "SYSLIBS=" "SYSLDFLAGS=-s" lua.exe
$(MAKE) "LUAC_T=luac.exe" luac.exe

posix:
$(MAKE) $(ALL) SYSCFLAGS="-DLUA_USE_POSIX"

SunOS solaris:
$(MAKE) $(ALL) SYSCFLAGS="-DLUA_USE_POSIX -DLUA_USE_DLOPEN
-D_REENTRANT" SYSLIBS="-ldl"

# Targets that do not create files (not all makes understand .PHONY).
.PHONY: all $(PLATS) help test clean default o a depend echo

# Compiler modules may use special flags.
llex.o:
$(CC) $(CFLAGS) $(CMCFLAGS) -c llex.c

lparser.o:
$(CC) $(CFLAGS) $(CMCFLAGS) -c lparser.c

lcode.o:
$(CC) $(CFLAGS) $(CMCFLAGS) -c lcode.c

# DO NOT DELETE

lapi.o: lapi.c lprefix.h lua.h luaconf.h lapi.h llimits.h lstate.h \
lobject.h ltm.h lzio.h lmem.h ldebug.h ldo.h lfunc.h lgc.h lstring.h \
ltable.h lundump.h lvm.h
lauxlib.o: lauxlib.c lprefix.h lua.h luaconf.h lauxlib.h
lbaselib.o: lbaselib.c lprefix.h lua.h luaconf.h lauxlib.h lualib.h
lcode.o: lcode.c lprefix.h lua.h luaconf.h lcode.h llex.h lobject.h \
llimits.h lzio.h lmem.h lopcodes.h lparser.h ldebug.h lstate.h ltm.h \
ldo.h lgc.h lstring.h ltable.h lvm.h
lcorolib.o: lcorolib.c lprefix.h lua.h luaconf.h lauxlib.h lualib.h
lctype.o: lctype.c lprefix.h lctype.h lua.h luaconf.h llimits.h
ldblib.o: ldblib.c lprefix.h lua.h luaconf.h lauxlib.h lualib.h
ldebug.o: ldebug.c lprefix.h lua.h luaconf.h lapi.h llimits.h lstate.h \
lobject.h ltm.h lzio.h lmem.h lcode.h llex.h lopcodes.h lparser.h \
ldebug.h ldo.h lfunc.h lstring.h lgc.h ltable.h lvm.h
ldo.o: ldo.c lprefix.h lua.h luaconf.h lapi.h llimits.h lstate.h \
lobject.h ltm.h lzio.h lmem.h ldebug.h ldo.h lfunc.h lgc.h lopcodes.h \
lparser.h lstring.h ltable.h lundump.h lvm.h
ldump.o: ldump.c lprefix.h lua.h luaconf.h lobject.h llimits.h lstate.h \
ltm.h lzio.h lmem.h lundump.h
lfunc.o: lfunc.c lprefix.h lua.h luaconf.h ldebug.h lstate.h lobject.h \
llimits.h ltm.h lzio.h lmem.h ldo.h lfunc.h lgc.h
lgc.o: lgc.c lprefix.h lua.h luaconf.h ldebug.h lstate.h lobject.h \
llimits.h ltm.h lzio.h lmem.h ldo.h lfunc.h lgc.h lstring.h ltable.h
linit.o: linit.c lprefix.h lua.h luaconf.h lualib.h lauxlib.h
liolib.o: liolib.c lprefix.h lua.h luaconf.h lauxlib.h lualib.h
llex.o: llex.c lprefix.h lua.h luaconf.h lctype.h llimits.h ldebug.h \
lstate.h lobject.h ltm.h lzio.h lmem.h ldo.h lgc.h llex.h lparser.h \
lstring.h ltable.h
lmathlib.o: lmathlib.c lprefix.h lua.h luaconf.h lauxlib.h lualib.h
lmem.o: lmem.c lprefix.h lua.h luaconf.h ldebug.h lstate.h lobject.h \
llimits.h ltm.h lzio.h lmem.h ldo.h lgc.h
loadlib.o: loadlib.c lprefix.h lua.h luaconf.h lauxlib.h lualib.h
lobject.o: lobject.c lprefix.h lua.h luaconf.h lctype.h llimits.h \
ldebug.h lstate.h lobject.h ltm.h lzio.h lmem.h ldo.h lstring.h lgc.h \
lvm.h
lopcodes.o: lopcodes.c lprefix.h lopcodes.h llimits.h lua.h luaconf.h
loslib.o: loslib.c lprefix.h lua.h luaconf.h lauxlib.h lualib.h
lparser.o: lparser.c lprefix.h lua.h luaconf.h lcode.h llex.h lobject.h \
llimits.h lzio.h lmem.h lopcodes.h lparser.h ldebug.h lstate.h ltm.h \
ldo.h lfunc.h lstring.h lgc.h ltable.h
lstate.o: lstate.c lprefix.h lua.h luaconf.h lapi.h llimits.h lstate.h \
lobject.h ltm.h lzio.h lmem.h ldebug.h ldo.h lfunc.h lgc.h llex.h \
lstring.h ltable.h
lstring.o: lstring.c lprefix.h lua.h luaconf.h ldebug.h lstate.h \
lobject.h llimits.h ltm.h lzio.h lmem.h ldo.h lstring.h lgc.h
lstrlib.o: lstrlib.c lprefix.h lua.h luaconf.h lauxlib.h lualib.h
ltable.o: ltable.c lprefix.h lua.h luaconf.h ldebug.h lstate.h lobject.h \
llimits.h ltm.h lzio.h lmem.h ldo.h lgc.h lstring.h ltable.h lvm.h
ltablib.o: ltablib.c lprefix.h lua.h luaconf.h lauxlib.h lualib.h
ltm.o: ltm.c lprefix.h lua.h luaconf.h ldebug.h lstate.h lobject.h \
llimits.h ltm.h lzio.h lmem.h ldo.h lgc.h lstring.h ltable.h lvm.h
lua.o: lua.c lprefix.h lua.h luaconf.h lauxlib.h lualib.h
luac.o: luac.c lprefix.h lua.h luaconf.h lauxlib.h ldebug.h lstate.h \
lobject.h llimits.h ltm.h lzio.h lmem.h lopcodes.h lopnames.h lundump.h
lundump.o: lundump.c lprefix.h lua.h luaconf.h ldebug.h lstate.h \
lobject.h llimits.h ltm.h lzio.h lmem.h ldo.h lfunc.h lstring.h lgc.h \
lundump.h
lutf8lib.o: lutf8lib.c lprefix.h lua.h luaconf.h lauxlib.h lualib.h
lvm.o: lvm.c lprefix.h lua.h luaconf.h ldebug.h lstate.h lobject.h \
llimits.h ltm.h lzio.h lmem.h ldo.h lfunc.h lgc.h lopcodes.h lstring.h \
ltable.h lvm.h ljumptab.h
lzio.o: lzio.c lprefix.h lua.h luaconf.h llimits.h lmem.h lstate.h \
lobject.h ltm.h lzio.h

# (end of Makefile)
Gary R. Schmidt
2024-02-02 13:25:23 UTC
Permalink
On 02/02/2024 22:13, bart wrote:
[Bitching about "make" snipped]

Try "cake", Zoltan wrote it many decades ago, when we were at
$GOSHWHATAUNIVERSITY, because he thought "make" was too prolix.

Cheers,
Gary B-)
bart
2024-02-02 13:29:53 UTC
Permalink
Post by bart
You're the one who needs to first write a pile of garbage within a
makefile in order for you to do:
make prog
Below is the makefile needed to build lua 5.4, which is a project of
only 35 C modules. Simple, isn't it?
Post by bart
---------------------------------
# Makefile for building Lua
# See ../doc/readme.html for installation and customization instructions.
# == CHANGE THE SETTINGS BELOW TO SUIT YOUR ENVIRONMENT
Now this is an interesting comment. The makefile is set up for gcc. For
another compiler it won't work.

If I try to switch to 'tcc', there are a number of problems. First,
unless you do 'make clean', the .o files lying about (I guess a
consequence of being to do incremental builds), are incompatible.

At this point I discovered a bug in the makefile for Lua (you might say
it's not a bug, it's one of the settings that need changing, but I've no
idea how or where):

Although this makefile works with gcc on Windows, it thinks the
executable is called 'lua', not 'lua.exe'. It will produce 'lua.exe'
with gcc, but it checks for the existence of 'lua'.

That is never present, so it always links; it never says 'is up-to-date'.

With tcc however, there's another issue: tcc requires the .exe extension
in the -o option, otherwise it writes the executable as 'lua'. Now, at
last, make sees 'lua' and deems it up-to-date. Unfortunately that won't
run under Windows.

Either not at all, or it will use the lua.exe left over from gcc. I can
bodge this by using '-o $@.exe', producing lua.exe from tcc, but make is
still checking 'lua'.
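
(For the record, the quoted makefile does have a Windows-aware entry
point: the intended invocation is presumably

$ make mingw

which re-invokes make with "LUA_T=lua.exe" and "LUA_A=lua54.dll", as the
mingw target above shows. It hardwires gcc-style flags, though, so it
doesn't help the tcc case.)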

There are some minor things: tcc doesn't like the -lm option for example.

But what it comes down to is that it seems I need a separate makefile
for each compiler. As supplied, it didn't even work 100% for gcc on Windows.

That means duplicating all that file info.

This is a solution I used before, using this @ file:

------------------------------
-O2 -s -o lua.exe
lua.c lapi.c lcode.c lctype.c ldebug.c ldo.c ldump.c lfunc.c lgc.c
llex.c lmem.c lobject.c lopcodes.c lparser.c lstate.c lstring.c
ltable.c ltm.c lundump.c lvm.c lzio.c lauxlib.c lbaselib.c lcorolib.c
ldblib.c liolib.c lmathlib.c loadlib.c loslib.c lstrlib.c ltablib.c
lutf8lib.c linit.c
------------------------------


If I run it like this:

gcc @luafiles

it produces a 260KB executable. Which is another interesting thing:
using 'make lua' set up for gcc produces a 360KB executable.

But I can also run it like this:

tcc @luafiles

The same file works for both gcc and tcc.

It won't work for mcc unless I split it into two, as that first line of
options doesn't work there. However with mcc I can now just do this:

mcc lua

So two solutions for this project that (1) don't involve a makefile; (2)
work better than the makefile.

It's true that it involved recompiling every module. But tcc still
builds this project in 0.3 seconds.

This project contains 34 C files, of which 33 are needed (not 35 as I
said). That means that using *.c is not possible, unless that extra file
(I believe used when building a shared library) is renamed.

If that is done, then all compilers just need "*.c" plus whatever other
options are needed.
David Brown
2024-02-02 09:47:12 UTC
Permalink
Post by bart
You don't see that the language taking over task (1) of the things
that makefiles do, and possibly (2) (of the list I posted; repeated
below), can streamline makefiles to make them shorter, simpler,
easier to write and to read, and with fewer opportunities to get
stuff wrong?
That was a rhetorical question. Obviously not.
I've nothing against shorter or simpler makefiles.  But as far as I
can see, you are just moving the same information from a makefile into
the C files.
Indeed, you are duplicating things - now your C files have to have
"#pragma module this, #pragma module that" in addition to having
"#include this.h, #include that.h".  With my makefiles, all the "this"
and "that" is found automatically - writing the includes in the C code
is sufficient.
    #include "file.h"
doesn't necessarily mean there is a matching "file.c". It might not
exist, or the header might be for some external library, or maybe it
does exist but in a different location.
As I said, you are duplicating things.

For my builds, I do not have anywhere that I need to specify "file.c".
Or maybe some code may use a file "fred.c", which needs to be submitted
to the compiler, but for which there is either no header used, or which
uses a header file with a different name.
As I said, C's uses of .h and .c files are chaotic.
My uses of .h and .c files are not chaotic.

Maybe you can't write well-structured C programs. Certainly not
everyone can. (And /please/ do not give another list of open source
programs that you don't like. I didn't write them. I can tell you how
and why /I/ organise my projects and makefiles - I don't speak for others.)
Did you have in mind using gcc's -MM option? For my 'cipher.c' demo,
that only gives a set of header names.  Missing are hmac.c and sha2.c.
I use makefiles where gcc's "-M" options are part of the solution - not
the whole solution.
If I try it on lua.c, it gives me only 5 header files; the project
comprises 33 .c files and 27 .h files.
I don't care. I did not write lua.

But I /have/ integrated lua with one of my projects, long ago. It fit
into my makefile format without trouble - I added the lua directory as a
subdirectory of my source directory, and that was all that was needed.
Post by bart
Post by David Brown
Perhaps I would find your tools worked for a "Hello, world" project.
Maybe they were still okay as it got slightly bigger.  Then I'd have
something that they could not handle, and I'd reach for make.  What
would be the point of using "make" to automate - for example -
post-processing of a binary to add a CRC check, but using your tools
to handle the build?  It's much easier just to use "make" for the
whole thing.
Because building one binary is a process that should be the job of a
compiler, not some random external tool that knows nothing of the
language or compiler.
No, it is the job of the linker.
There is where you're still stuck in the past.
I first got rid of a formal 'linker' about 40 years ago. I got rid of
the notion of combining independently compiled modules into an
executable a decade ago.
No, you built a monolithic tool that /included/ the linker. That's fine
for niche tools that are not intended to work with anything else. Most
people work with many tools - that's why we have standards, defined file
formats, and flexible tools with wide support.

Other people got rid of monolithic tools forty years ago when they
realised it was a terrible way to organise things.
I know exactly what it does. I am entirely without doubt that I know
the point and advantages of them better than you do - the /real/ points
and advantages, not some pathetic "it means I don't have to use that
horrible nasty make program" reason.
* It means that for each binary, all sources are recompiled at the same
  time to create it
No, it does not.
* It doesn't mean that an application can only comprise one binary
Correct.
* It moves the compilation unit granularity from a module to a single
  EXE or DLL file
No, it does not.
* Interfaces (in the case of a lower level language), are moved inter-
  module to inter-program. The boundaries are between one program or
  library and another, not between modules.
Correct.
A language which claims to have a module system, but still compiles a
module at a time, will probably still have discrete inter-module
interfaces, although they may be handled automatically.
Correct.


In real-world whole program compilation systems, the focus is on
inter-module optimisations. Total build times are expected to go /up/.
Build complexity can be much higher, especially for large programs. It
is more often used for C++ than C.

The main point of a lot of whole-program compilation is to allow
cross-module optimisation. It means you can have "access" functions
hidden away in implementation files so that you avoid global variables
or inter-dependencies between modules, but now they can be inline across
modules so that you have no overhead or costs for this. It means you
can write code that is more structured and modular, with different teams
handling different parts, and with layers of abstractions, but when you
pull it all together into one whole-program build, the run-time costs
and overhead for this all disappear. And it means lots of checks and
static analysis can be done across the whole program.


For such programs, each translation unit is still compiled separately,
but the "object" files contain internal data structures and analysis
information, rather than generated code. Lots of the work is done by
this point, with inter-procedural optimisations done within the unit.
These compilations will be done as needed, in parallel, under the
control of a build system. Then they are combined for the linking and
link-time optimisation which fits the parts together. Doing this in a
scalable way is hard, and the subject of a lot of research, as you need
to partition it into chunks that can be handled in parallel on multiple
cpu cores (or even distributed amongst servers). Once you have parts of
code that are ready, they are handed on to backend compilers that do
more optimisation and generate the object code, and this in turn is
linked (sometimes incrementally in parts, again aiming at improving
parallel building and scalability).
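(A hedged sketch of that flow with gcc: the per-unit compiles emit
objects that mostly carry intermediate representation, and the real
code generation happens at link time:)

$ gcc -c -O2 -flto a.c                 # "object" holds IR, little code
$ gcc -c -O2 -flto b.c                 # units still compile in parallel
$ gcc -O2 -flto=auto a.o b.o -o prog   # cross-module optimisation here

(-flto=auto, on reasonably recent gcc versions, lets the link step pick
its own number of parallel LTO jobs.)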


You go to all this effort because you are building software that is used
by millions of people, and your build effort is minor compared to the
total improvements for all users combined. Or you do it because you are
building speed-critical software. Or you want the best static analysis
you can get, and want that done across modules. Or you are building
embedded systems that need to be as efficient as possible.

You don't do it because you find "make" ugly.


It is also very useful on old-fashioned microcontrollers with multiple
banks for data ram and code memory, and no good data stack access - the
compiler can do large-scale lifetime analysis and optimise placement and
the re-use of the very limited ram.
/Nobody/ has makefiles forced on them.  People use "make" because it
is convenient, and it works.
BUT IT DOESN'T.
IT DOES WORK.

People use it all the time.
It fails a lot of the time on Windows, and the makefiles are too
complicated to figure out why.
People use it all the time on Windows.

Even Microsoft ships its own version of make, "nmake.exe", and has done
for decades.

/You/ can't work it, but you excel at failing to get things working.
You have a special gift - you just have to look at a computer with tools
that you didn't write yourself, and it collapses.
But I have no interest in changing to something vastly more limited
and which adds nothing at all.
That's right; it adds nothing, but it takes a lot away! Like a lot of
failure points.
Like pretty much everything I need.
Michael S
2024-02-02 13:45:31 UTC
Permalink
On Fri, 2 Feb 2024 10:47:12 +0100
Post by David Brown
Post by bart
You don't see that the language taking over task (1) of the
things that makefiles do, and possibly (2) (of the list I posted;
repeated below), can streamline makefiles to make them shorter,
simpler, easier to write and to read, and with fewer
opportunities to get stuff wrong?
That was a rhetorical question. Obviously not.
I've nothing against shorter or simpler makefiles.  But as far as
I can see, you are just moving the same information from a
makefile into the C files.
Indeed, you are duplicating things - now your C files have to have
"#pragma module this, #pragma module that" in addition to having
"#include this.h, #include that.h".  With my makefiles, all the
"this" and "that" is found automatically - writing the includes in
the C code is sufficient.
    #include "file.h"
doesn't necessarily mean there is a matching "file.c". It might not
exist, or the header might be for some external library, or maybe
it does exist but in a different location.
As I said, you are duplicating things.
For my builds, I do not have anywhere that I need to specify "file.c".
Or maybe some code may use a file "fred.c", which needs to be
submitted to the compiler, but for which there is either no header
at all, or a header file with a different name.
As I said, C's uses of .h and .c files are chaotic.
My uses of .h and .c files are not chaotic.
Maybe you can't write well-structured C programs. Certainly not
everyone can. (And /please/ do not give another list of open source
programs that you don't like. I didn't write them. I can tell you
how and why /I/ organise my projects and makefiles - I don't speak
for others.)
Did you have in mind using gcc's -MM option? For my 'cipher.c'
demo, that only gives a set of header names.  Missing are hmac.c
and sha2.c.
I use makefiles where gcc's "-M" options are part of the solution -
not the whole solution.
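(For a concrete, hedged sketch of that pattern, reusing the cipher.c
demo's file names: the .c list is still written by hand, and gcc's
-MMD/-MP flags generate only the header dependencies:)

SRCS := cipher.c hmac.c sha2.c
OBJS := $(SRCS:.c=.o)

cipher: $(OBJS)
	$(CC) $(OBJS) -o $@

%.o: %.c
	$(CC) $(CFLAGS) -MMD -MP -c $< -o $@

-include $(OBJS:.o=.d)    # generated dependency files, if present yet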
If I try it on lua.c, it gives me only 5 header files; the project
comprises 33 .c files and 27 .h files.
I don't care. I did not write lua.
But I /have/ integrated lua with one of my projects, long ago. It
fit into my makefile format without trouble - I added the lua
directory as a subdirectory of my source directory, and that was all
that was needed.
Post by bart
Post by David Brown
Perhaps I would find your tools worked for a "Hello, world"
project. Maybe they were still okay as it got slightly bigger.
Then I'd have something that they could not handle, and I'd
reach for make.  What would be the point of using "make" to
automate - for example - post-processing of a binary to add a
CRC check, but using your tools to handle the build?  It's much
easier just to use "make" for the whole thing.
Because building one binary is a process that should be the job of a
compiler, not some random external tool that knows nothing of the
language or compiler.
No, it is the job of the linker.
There is where you're still stuck in the past.
I first got rid of a formal 'linker' about 40 years ago. I got rid
of the notion of combining independently compiled modules into an
executable a decade ago.
No, you built a monolithic tool that /included/ the linker. That's
fine for niche tools that are not intended to work with anything
else. Most people work with many tools - that's why we have
standards, defined file formats, and flexible tools with wide support.
Other people got rid of monolithic tools forty years ago when they
realised it was a terrible way to organise things.
Actually, nowadays monolithic tools are a solid majority in programming.
I mean, programming in general, not C/C++/Fortran programming which by
itself is a [sizable] minority.
Even in C++, a majority uses non-monolithic tools well-hidden behind a
front end (IDE) that makes them indistinguishable from monolithic ones.
Post by David Brown
I know exactly what it does. I am entirely without doubt that I know
the point and advantages of them better than you do - the /real/
points and advantages, not some pathetic "it means I don't have to
use that horrible nasty make program" reason.
* It means that for each binary, all sources are recompiled at the
same time to create it
No, it does not.
* It doesn't mean that an application can only comprise one binary
Correct.
* It moves the compilation unit granularity from a module to a
single EXE or DLL file
No, it does not.
* Interfaces (in the case of a lower level language), are moved
inter- module to inter-program. The boundaries are between one
program or library and another, not between modules.
Correct.
A language which claims to have a module system, but still compiles
a module at a time, will probably still have discrete inter-module
interfaces, although they may be handled automatically.
Correct.
In real-world whole program compilation systems, the focus is on
inter-module optimisations. Total build times are expected to go
/up/. Build complexity can be much higher, especially for large
programs. It is more often used for C++ than C.
The main point of a lot of whole-program compilation is to allow
cross-module optimisation. It means you can have "access" functions
hidden away in implementation files so that you avoid global
variables or inter-dependencies between modules, but now they can be
inline across modules so that you have no overhead or costs for this.
It means you can write code that is more structured and modular,
with different teams handling different parts, and with layers of
abstractions, but when you pull it all together into one
whole-program build, the run-time costs and overhead for this all
disappear. And it means lots of checks and static analysis can be
done across the whole program.
For such programs, each translation unit is still compiled
separately, but the "object" files contain internal data structures
and analysis information, rather than generated code. Lots of the
work is done by this point, with inter-procedural optimisations done
within the unit. These compilations will be done as needed, in
parallel, under the control of a build system. Then they are
combined for the linking and link-time optimisation which fits the
parts together. Doing this in a scalable way is hard, and the
subject of a lot of research, as you need to partition it into chunks
that can be handled in parallel on multiple cpu cores (or even
distributed amongst servers). Once you have parts of code that are
ready, they are handed on to backend compilers that do more
optimisation and generate the object code, and this in turn is linked
(sometimes incrementally in parts, again aiming at improving parallel
building and scalability).
You go to all this effort because you are building software that is
used by millions of people, and your build effort is minor compared
to the total improvements for all users combined. Or you do it
because you are building speed-critical software. Or you want the
best static analysis you can get, and want that done across modules.
Or you are building embedded systems that need to be as efficient as
possible.
You don't do it because you find "make" ugly.
It is also very useful on old-fashioned microcontrollers with
multiple banks for data ram and code memory, and no good data stack
access - the compiler can do large-scale lifetime analysis and
optimise placement and the re-use of the very limited ram.
/Nobody/ has makefiles forced on them.  People use "make" because
it is convenient, and it works.
BUT IT DOESN'T.
IT DOES WORK.
People use it all the time.
It fails a lot of the time on Windows, and the makefiles are too
complicated to figure out why.
People use it all the time on Windows.
Even Microsoft ships its own version of make, "nmake.exe", and has
done for decades.
/You/ can't work it, but you excel at failing to get things working.
You have a special gift - you just have to look at a computer with
tools that you didn't write yourself, and it collapses.
But I have no interest in changing to something vastly more
limited and which adds nothing at all.
That's right; it adds nothing, but it takes a lot away! Like a lot
of failure points.
Like pretty much everything I need.
David Brown
2024-02-02 15:26:12 UTC
Permalink
Post by Michael S
Actually, nowadays monolithic tools are a solid majority in programming.
I mean, programming in general, not C/C++/Fortran programming which by
itself is a [sizable] minority.
Even in C++, a majority uses non-monolithic tools well-hidden behind a
front end (IDE) that makes them indistinguishable from monolithic ones.
It can often be helpful to have a single point of interaction - a
front-end that combines tools. But usually these are made of parts.

For many of the microcontrollers I work with, the manufacturer's
standard development toolset is based around Eclipse and gcc. From the
user point of view, it looks a lot like one monolithic IDE that lets you
write your code, compile and link it, and download and debug it on the
microcontroller. Under the hood, it is far from a monolithic
application. Different bits come from many different places. This
means the microcontroller manufacturer is only making the bits that are
specific to /their/ needs - such as special views while debugging, or
"wizards" for configuring chip pins. The Eclipse folk are experts at
making an editor and IDE, the gcc folks are experts at the compiler, the
openocd folks know about jtag debugging, and so on. And to a fair
extent, advanced users can use the bits they want and leave out other
bits. I sometimes use other editors, but might still use the toolchain
provided with the manufacturer's tools. I might swap out the debugger
connection. I might use the IDE for something completely different. I
might install additional features in the IDE. I might use different
toolchains. Manufacturers, when putting things together, might change
where they get their toolchains, or what debugging connectors they use.
It's even been known for them to swap out the base IDE while keeping
most of the rest the same (VS Code has become a popular choice now, and
a few use NetBeans rather than Eclipse).

(Oh, and for those that don't believe "make" and "gcc" work on Windows,
these development tools invariably have "make" and almost invariably use
gcc as their toolchain, all working in almost exactly the same way on
Linux and Windows. The only difference is builds are faster on Linux.)

This is getting the best (or at least, trying to) from all worlds. It
gives people the ease-of-use advantages of monolithic tools without the
key disadvantages of real monolithic tools - half-arsed editors,
half-arsed project managers, half-arsed compilers, and poor
extensibility because the suppliers are trying to do far too much
themselves.

I don't think it is common now to have /real/ monolithic development
tools. But it is common to have front-ends aimed at making the
underlying tools easier and more efficient to use, and to provide
all-in-one base packages.
bart
2024-02-02 14:14:31 UTC
Permalink
Post by David Brown
Post by bart
As I said, C's uses of .h and .c files are chaotic.
My uses of .h and .c files are not chaotic.
We can't write tools that only work for careful users. Any open-source
project I want to build WILL be chaotic.

We can however write languages where you are forced to be more
disciplined. Mine doesn't have the equivalent of .h files for example.

However this is about C.
Post by David Brown
Post by bart
I first got rid of a formal 'linker' about 40 years ago. I got rid of
the notion of combining independently compiled modules into an
executable a decade ago.
No, you built a monolithic tool that /included/ the linker.
No, I ELIMINATED the linker.

And in the past, I wrote a program called a Loader, much simpler than a
linker, and very fast (it had to be as I worked with floppies).
Post by David Brown
  That's fine
for niche tools that are not intended to work with anything else.  Most
people work with many tools - that's why we have standards, defined file
formats, and flexible tools with wide support.
Other people got rid of monolithic tools forty years ago when they
realised it was a terrible way to organise things.
I know exactly what it does.  I am entirely without doubt that I know
the point and advantages of them better than you do
You can't create a language devised for whole-program compilation, and
implement a full-stack compiler for it, without learning a lot about the
ins and outs.

So I suspect I know a bit more about it than you do.

Probably you're mixing this up with whole-program optimisation.

- the /real/ points
Post by David Brown
and advantages, not some pathetic "it means I don't have to use that
horrible nasty make program" reason.
Post by bart
* It means that for each binary, all sources are recompiled at the same
   time to create it
No, it does not.
That's not a whole-program compiler then. Not if half the modules were
compiled last week!
Post by David Brown
Post by bart
* It doesn't mean that an application can only comprise one binary
Correct.
Post by bart
* It moves the compilation unit granularity from a module to a single
   EXE or DLL file
No, it does not.
Again, it can't be a whole-program compiler if it can compile modules
independently.
Post by David Brown
In real-world whole program compilation systems, the focus is on
inter-module optimisations.  Total build times are expected to go /up/.
Build complexity can be much higher, especially for large programs.  It
is more often used for C++ than C.
The main point of a lot of whole-program compilation is to allow
cross-module optimisation.  It means you can have "access" functions
hidden away in implementation files so that you avoid global variables
or inter-dependencies between modules, but now they can be inline across
modules so that you have no overhead or costs for this.  It means you
can write code that is more structured and modular, with different teams
handling different parts, and with layers of abstractions, but when you
pull it all together into one whole-program build, the run-time costs
and overhead for this all disappear.  And it means lots of checks and
static analysis can be done across the whole program.
For such programs, each translation unit is still compiled separately,
but the "object" files contain internal data structures and analysis
information, rather than generated code.  Lots of the work is done by
this point, with inter-procedural optimisations done within the unit.
These compilations will be done as needed, in parallel, under the
control of a build system.  Then they are combined for the linking and
link-time optimisation which fits the parts together.  Doing this in a
scalable way is hard, and the subject of a lot of research, as you need
to partition it into chunks that can be handled in parallel on multiple
cpu cores (or even distributed amongst servers).  Once you have parts of
code that are ready, they are handed on to backend compilers that do
more optimisation and generate the object code, and this in turn is
linked (sometimes incrementally in parts, again aiming at improving
parallel building and scalability).
You've just described a tremendously complex way to do whole-program
analysis.

There are easier ways. The C transpiler I use takes a project of dozens
of modules in my language, and produces a single C source file which
will form one EXE or one DLL file.

Now any ordinary optimising C compiler has a view of the entire program
and can do wider optimisations (but that view does not span multiple
EXE/DLL files.)
Post by David Brown
/You/ can't work it, but you excel at failing to get things working. You
have a special gift - you just have to look at a computer with tools
that you didn't write yourself, and it collapses.
Yes, I do. I'm like that kid poking fun at the emperor's new clothes;
I'm just stating what I see. But in one way it is hilarious seeing you
lot defend programs like 'as' to the death.

Why not just admit that it is a POS that you've had to learn to live
with, instead of trying to make out it is somehow superior?
Michael S
2024-02-02 14:43:51 UTC
Permalink
On Fri, 2 Feb 2024 14:14:31 +0000
Post by bart
Post by David Brown
Post by bart
As I said, C's uses of .h and .c files are chaotic.
My uses of .h and .c files are not chaotic.
We can't write tools that only work for careful users. Any
open-source project I want to build WILL be chaotic.
We can however write languages where you are forced to be more
disciplined. Mine doesn't have the equivalent of .h files for example.
However this is about C.
Post by David Brown
Post by bart
I first got rid of a formal 'linker' about 40 years ago. I got rid
of the notion of combining independently compiled modules into an
executable a decade ago.
No, you built a monolithic tool that /included/ the linker.
No, I ELIMINATED the linker.
And in the past, I wrote a program called a Loader, much simpler than
a linker, and very fast (it had to be as I worked with floppies).
Post by David Brown
  That's fine
for niche tools that are not intended to work with anything else.
Most people work with many tools - that's why we have standards,
defined file formats, and flexible tools with wide support.
Other people got rid of monolithic tools forty years ago when they
realised it was a terrible way to organise things.
I know exactly what it does.  I am entirely without doubt that I
know the point and advantages of them better than you do
You can't create a language devised for whole-program compilation,
and implement a full-stack compiler for it, without learning a lot
about the ins and outs.
So I suspect I know a bit more about it than you do.
Probably you're mixing this up with whole-program optimisation.
- the /real/ points
Post by David Brown
and advantages, not some pathetic "it means I don't have to use
that horrible nasty make program" reason.
Post by bart
* It means that for each binary, all sources are recompiled at the
same time to create it
No, it does not.
That's not a whole-program compiler then. Not if half the modules
were compiled last week!
Post by David Brown
Post by bart
* It doesn't mean that an application can only comprise one binary
Correct.
Post by bart
* It moves the compilation unit granularity from a module to a
single EXE or DLL file
No, it does not.
Again, it can't be a whole-program compiler if it can compile modules
independently.
Post by David Brown
In real-world whole program compilation systems, the focus is on
inter-module optimisations.  Total build times are expected to go
/up/. Build complexity can be much higher, especially for large
programs.  It is more often used for C++ than C.
The main point of a lot of whole-program compilation is to allow
cross-module optimisation.  It means you can have "access"
functions hidden away in implementation files so that you avoid
global variables or inter-dependencies between modules, but now
they can be inline across modules so that you have no overhead or
costs for this.  It means you can write code that is more
structured and modular, with different teams handling different
parts, and with layers of abstractions, but when you pull it all
together into one whole-program build, the run-time costs and
overhead for this all disappear.  And it means lots of checks and
static analysis can be done across the whole program.
For such programs, each translation unit is still compiled
separately, but the "object" files contain internal data structures
and analysis information, rather than generated code.  Lots of the
work is done by this point, with inter-procedural optimisations
done within the unit. These compilations will be done as needed, in
parallel, under the control of a build system.  Then they are
combined for the linking and link-time optimisation which fits the
parts together.  Doing this in a scalable way is hard, and the
subject of a lot of research, as you need to partition it into
chunks that can be handled in parallel on multiple cpu cores (or
even distributed amongst servers).  Once you have parts of code
that are ready, they are handed on to backend compilers that do
more optimisation and generate the object code, and this in turn is
linked (sometimes incrementally in parts, again aiming at improving
parallel building and scalability).
You've just described a tremendously complex way to do whole-program
analysis.
But it proves that your statement above (it can't be a whole-program
compiler if it can compile modules independently) is false.
Post by bart
There are easier ways. The C transpiler I use takes a project of
dozens of modules in my language, and produces a single C source file
which will form one EXE or one DLL file.
Now any ordinary optimising C compiler has a view of the entire
program and can do wider optimisations (but that view does not span
multiple EXE/DLL files.)
If the program in question is really big then there is a good chance
that your method will expose internal limits of the back-end compiler.
I think, that's one of the reason (not the only one) why Mozilla didn't
re-write the whole Firefox in Rust. According to my understanding, Rust
does something similar to your approach, except that it outputs LLVM IR
instead of C, and there is real concern that the LLVM back end would
have trouble with input as big as the whole of FF.
Post by bart
Post by David Brown
/You/ can't work it, but you excel at failing to get things
working. You have a special gift - you just have to look at a
computer with tools that you didn't write yourself, and it
collapses.
Yes, I do. I'm like that kid poking fun at the emperor's new clothes;
I'm just stating what I see. But in one way it is hilarious seeing
you lot defend programs like 'as' to the death.
Why not just admit that it is a POS that you've had to learn to live
with, instead of trying to make out it is somehow superior?
I never run gnu as directly. Running it by means of a driver program
(personally I prefer clang for that task, but gcc will do the job as
well) isolates me from all peculiarities.
bart
2024-02-02 15:18:51 UTC
Permalink
Post by Michael S
On Fri, 2 Feb 2024 14:14:31 +0000
Post by bart
You've just described a tremendously complex way to do whole-program
analysis.
But it proves that your statement above (it can't be a whole-program
compiler if it can compile modules independently) is false.
Then /every/ compiler can be regarded as a whole-program one, since the
end result, even if the modules were randomly compiled over the last
month, will always be a whole program.

So it comes down to what is meant by a whole-program compiler.

My definition is where you build one program (eg. one EXE or DLL file on
Windows) with ONE invocation of the compiler, which processes ALL source
and support files from scratch.

The output (from my compiler) is a single file, usually an EXE or DLL,
that may use external shared libraries. Or, rarely, it may generate a
single OBJ file for more exotic requirements, but it will need external
tools. Then it may end up as a component of a larger program.

Or sometimes the output is fixed up in memory and run immediately.

That describes the compiler I use for my systems language.

My C compiler is not a whole-program one. Although you can submit all
modules and it can produce one EXE/DLL file, so that the behaviour can
appear similar, internally they are compiled independently.

I have thought about using real whole-program techniques (so that all
modules share a global symbol table for example, and common headers are
processed only once), but I don't use C enough to make that interesting
to attempt.
Post by Michael S
Post by bart
There are easier ways. The C transpiler I use takes a project of
dozens of modules in my language, and produces a single C source file
which will form one EXE or one DLL file.
Now any ordinary optimising C compiler has a view of the entire
program and can do wider optimisations (but that view does not span
multiple EXE/DLL files.)
If the program in question is really big then there is a good chance
that your method will expose internal limits of the back-end compiler.
My personal whole-program projects impose some limitations.

One is the scale of the application being compiled. However they are
designed for use with a fast compiler. That puts an upper limit of about
0.5M lines per project, if you want to keep build time below, say, 1 second.

(Figures pertain to my slowish PC, running an unoptimised compiler, so
are conservative. An optimised compiler might be 40% faster.)

0.5M lines of code means about a 5MB executable, which is a pretty hefty
project. The vast majority of executables and libraries on my PC are
smaller than that.

Another is that whole-program compilation is harder to parallelise (the
above figures are for a single core). But you can of course compile
multiple programs at the same time.

The killer is that most professional compilers are hugely complex: they
are big, and they take considerable machine resources. These are ones
like gcc or any using LLVM.

So to get around that in order to do whole-program stuff, things get
very, very complicated.

I can't help that.

But I can't remember how we got here. The thread subject is the
far-simpler-to-realise topic of discovering the modules for a
non-whole-program C compiler, which seems to give the big boys a lot
more trouble!
tTh
2024-02-02 19:43:59 UTC
Permalink
Post by bart
My definition is where you build one program (eg. one EXE or DLL file on
Windows) with ONE invocaton of the compiler, which processes ALL source
and support files from scratch.
And can you disclose the magic trick that lets your magic
compiler know exactly the list of "ALL source and support
files" needed for a build from scratch?
--
+---------------------------------------------------------------------+
| https://tube.interhacker.space/a/tth/video-channels |
+---------------------------------------------------------------------+
David Brown
2024-02-02 15:31:46 UTC
Permalink
Post by bart
Yes, I do. I'm like that kid poking fun at the emperor's new clothes;
I'm just stating what I see. But in one way it is hilarious seeing you
lot defend programs like 'as' to the death.
No, /you/ are the emperor in this analogy. Well, you are actually the
kid - except you are the kid with no clothes who /thinks/ he's an emperor.
Post by bart
Why not just admit that it is a POS that you've had to learn to live
with, instead of trying to make out it is somehow superior?
The whole world is out of step, except Bart.

Has it never occurred to you that when you are in disagreement with
everyone, /you/ might be the one that is wrong? I think you suffer from
the "misunderstood genius" myth. It's surprisingly common amongst
people who have invested heavily in going their own way, against common
knowledge or common practice. It's a sort of psychological defence
mechanism against realising you've been wrong all this time.
bart
2024-02-02 17:00:12 UTC
Permalink
Post by bart
Yes, I do. I'm like that kid poking fun at the emperor's new clothes;
I'm just stating what I see. But in one way it is hilarious seeing you
lot defend programs like 'as' to the death.
No, /you/ are the emperor in this analogy.  Well, you are actually the
kid - except you are the kid with no clothes who /thinks/ he's an emperor.
Post by bart
Why not just admit that it is a POS that you've had to learn to live
with, instead of trying to make out it is somehow superior?
The whole world is out of step, except Bart.
Has it ever occurred to YOU that the world is more than Unix and make
and massive compilers like gcc and clang?
Has it never occurred to you that when you are in disagreement with
everyone, /you/ might be the one that is wrong?  I think you suffer from
the "misunderstood genius" myth.  It's surprisingly common amongst
people who have invested heavily in going their own way, against common
knowledge or common practice.  It's a sort of psychological defence
mechanism against realising you've been wrong all this time.
This is a newsgroup about C. That is a language that can be fairly
adequately implemented with a 180KB program, the size of Tiny C. Tiny C
itself can turn C source into binary at about 10MB per second.

So, a toy language, really, and a toy implementation that nevertheless
does the job: in most cases, a user of the resulting program will not be
able to tell how it was compiled.

And yet there is this massive collection of huge, complex tools built
around a toy language, dwarfing it by 1000:1, that you insist is what
it's really all about, and you want to put down anyone who disagrees.

It's like saying that the only businesses worth having are huge
corporations, or the only form of transport must be a jetliner.

The way 'as' works IS rubbish. It is fascinating how you keep trying to
turn it round and make it about me. There can't possibly be anything
wrong with it, whoever says so must be deluded!
Kaz Kylheku
2024-02-02 17:31:40 UTC
Permalink
Post by bart
Has it ever occurred to YOU that the world is more than Unix and make
and massive compilers like gcc and clang?
There is more, but from your perspective, it's just more stuff to
shake your fist at and avoid learning about.
--
TXR Programming Language: http://nongnu.org/txr
Cygnal: Cygwin Native Application Library: http://kylheku.com/cygnal
Mastodon: @***@mstdn.ca
Keith Thompson
2024-02-02 18:36:15 UTC
Permalink
bart <***@freeuk.com> writes:
[...]
Post by bart
The way 'as' works IS rubbish. It is fascinating how you keep trying
to turn it round and make it about me. There can't possibly be
anything wrong with it, whoever says so must be deluded!
"as" works. It's not perfect, but it's good enough. Its job is to
translate assembly code to object code. It does that. There is
nothing you could do with your preferred user interface (whatever that
might be) that can't be done with the existing one. "as" is rarely
invoked directly, so any slight clumsiness in its well defined user
interface hardly matters. Any changes to its interface could break
existing scripts.

Nobody is claiming that "there can't possibly be anything wrong with
it". You made that up.

Why does the way "as" works offend you?
--
Keith Thompson (The_Other_Keith) Keith.S.Thompson+***@gmail.com
Working, but not speaking, for Medtronic
void Void(void) { Void(); } /* The recursive call of the void */
Kaz Kylheku
2024-02-02 18:54:06 UTC
Permalink
Post by bart
The way 'as' works IS rubbish.
Pretend a developer of "as" (say, the GNU one) is reading this thread.

What is it that is broken?

Do you have a minimal repro test case of your issue?

What is the proposed fix?
Post by bart
turn it round and make it about me. There can't possibly be anything
wrong with it, whoever says so must be deluded!
A vast amount of code is being compiled daily, passing through as,
without anyone noticing.
--
TXR Programming Language: http://nongnu.org/txr
Cygnal: Cygwin Native Application Library: http://kylheku.com/cygnal
Mastodon: @***@mstdn.ca
Kaz Kylheku
2024-02-02 16:26:12 UTC
Permalink
Post by bart
disciplined. Mine doesn't have the equivalent of .h files for example.
My musical instrument has frets for easy intonation, you silly violin
people, in your silly violin newsgroup.
--
TXR Programming Language: http://nongnu.org/txr
Cygnal: Cygwin Native Application Library: http://kylheku.com/cygnal
Mastodon: @***@mstdn.ca
Lawrence D'Oliveiro
2024-02-01 23:30:14 UTC
Permalink
Post by David Brown
I am, however, considering CMake (which works at a
higher level, and outputs makefiles, ninja files or other project
files).
Ninja was created as an alternative to Make. Basically, if your Makefiles
are going to be generated by a meta-build system like CMake or Meson, then
they don’t need to support the kinds of niceties that facilitate writing
them by hand. So you strip it right down to the bare-bones functionality,
which makes your builds fast while consuming minimal resources, and that
is Ninja.
Post by David Brown
It appears to have some disadvantages compared to my makefiles,
such as needing to be run as an extra step when files are added to or
removed from a project or dependencies are changed, but that doesn't
happen too often, and its integration with other tools and projects
might make it an overall win.
Some are proposing Meson as an alternative to CMake. I think they are
saying that the fact that its scripting language is not fully Turing-
equivalent is an advantage.

Me, while I think the CMake language can be a little clunky in places, I
still think having Turing-equivalence is better than not having it. ;)
David Brown
2024-02-02 10:05:22 UTC
Permalink
Post by Lawrence D'Oliveiro
Post by David Brown
I am, however, considering CMake (which works at a
higher level, and outputs makefiles, ninja files or other project
files).
Ninja was created as an alternative to Make.
It is an alternative to some uses of make - but by no means all uses.
Post by Lawrence D'Oliveiro
Basically, if your Makefiles
are going to be generated by a meta-build system like CMake or Meson, then
they don’t need to support the kinds of niceties that facilitate writing
them by hand. So you strip it right down to the bare-bones functionality,
which makes your builds fast while consuming minimal resources, and that
is Ninja.
Yes.

It is not normal to write ninja files by hand - the syntax is relatively
simple, but quite limited. So it covers the lower level bits of "make",
but not the higher level bits.


Perhaps ninja is the tool that Bart is looking for? For the kinds of
things he is doing, I don't think it would be hard to write the ninja
files by hand.
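(For example, a minimal hand-written build.ninja, assuming gcc and with
placeholder file names, might look like this:)

rule cc
  command = gcc -MMD -MF $out.d -c $in -o $out
  depfile = $out.d
  deps = gcc

rule link
  command = gcc $in -o $out

build main.o: cc main.c
build util.o: cc util.c
build prog: link main.o util.o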



So it won't work for my needs, as I want to work at a higher level
(without manually detailing file lists and dependencies).

But if I find that CMake supports all I need at that level, then I
expect I could just as easily generate ninja files as makefiles. The
only issue that I know of is that ninja does not have full jobserver
support, which could be important if the build involves other parallel
tasks (like gcc LTO linking).
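(For what it's worth, a hedged fragment of what full jobserver support
buys: GNU make can share its job slots with gcc's parallel LTO link,
provided the recipe line is marked with "+" and the compiler is new
enough to understand -flto=jobserver:)

prog: $(OBJS)
	+$(CC) -flto=jobserver $(OBJS) -o $@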
Post by Lawrence D'Oliveiro
Post by David Brown
It appears to have some disadvantages compared to my makefiles,
such as needing to be run as an extra step when files are added to or
removed from a project or dependencies are changed, but that doesn't
happen too often, and its integration with other tools and projects
might make it an overall win.
Some are proposing Meson as an alternative to CMake. I think they are
saying that the fact that its scripting language is not fully Turing-
equivalent is an advantage.
Me, while I think the CMake language can be a little clunky in places, I
still think having Turing-equivalence is better than not having it. ;)
For many reasons, CMake is the prime candidate as an alternative to make
for my use.
Malcolm McLean
2024-02-02 00:26:09 UTC
Permalink
Post by bart
It works for me, and I'm sure it could work for others if they didn't
have makefiles forced down their throats and hardwired into their brains.
/Nobody/ has makefiles forced on them.  People use "make" because it is
convenient, and it works.  If something better comes along, and it is
better enough to overcome the familiarity momentum, people will use that.
What?
You have total control of your programming environment and never have to
consider anybody else? For hobby programming you do in a way. Not if you
want other people to use your stuff. But you can always say that the fun
of doing things exactly your way outweighs the fun of getting downloads.

But for professional or academic programming, often you'll find you have
to use make. You don't have a choice. Either someone else took the
decision, or there are so many other people who expect that build shall
be via make that you have no real alternative.

Now in one study, someone had wanted to do a survey of genetic sequence
analysis software. They reported no results for half the programs,
because they had attempted to build them, and failed. They didn't say,
but it's a fair bet that most of those build systems used make. The
software distribution system is a disaster and badly needs fixing.

But there are lots of caveats. Bart's system might be better, but as
you say it needs traction. I'd be reluctant to evangelise for it and get
everyone to use it at work, because it might prove to have major
drawbacks, and then I'd get the blame. Which I wouldn't if I wrote a
makefile which broke. Not in the same way. And of course one person
can't rigorously test and debug, and build an ecosystem of ancillary
tools, documentation, resources, help message boards. However a lot of
things start small, with one lone programmer beavering away in his
bedroom. It's necessary to look at the positives, and not strangle
things at birth.
--
Check out Basic Algorithms and my other books:
https://www.lulu.com/spotlight/bgy1mm
bart
2024-02-02 00:35:23 UTC
Permalink
Post by Malcolm McLean
Post by bart
It works for me, and I'm sure it could work for others if they didn't
have makefiles forced down their throats and hardwired into their brains.
/Nobody/ has makefiles forced on them.  People use "make" because it
is convenient, and it works.  If something better comes along, and it
is better enough to overcome the familiarity momentum, people will use
that.
What?
You have total control of your programming environment and never have to
consider anybody else? For hobby programming you do in a way. Not if you
want other people to use your stuff. But you can always say that the fun
of doing things exactly your way outweighs the fun of getting downloads.
But for professional or academic programming, often you'll find you have
to use make. You don't have a choice. Either someone else took the
decision, or there are so many other people who expect that the build shall
be via make that you have no real alternative.
Now in one study, someone had wanted to do a survey of genetic sequence
analysis software. They reported no results for half the programs,
because they had attempted to build them, and failed. They didn't say,
but it's a fair bet that most of those build systems used make. The
software distribution system is a disaster and badly needs fixing.
But there are lots of caveats. Bart's system might be better, but as
you say it needs traction. I'd be reluctant to evangelise for it and get
everyone to use it at work, because it might prove to have major
drawbacks, and then I'd get the blame.
There's a lite, flexible version of it, which doesn't interfere with any
existing uses of 'make'.

That is to also provide a simple list of the C files somewhere, in a
comment or a text file, plus any other notes needed to build the project
(written in English or Norwegian, I don't care; Norwegian would be easier
to decode than a typical makefile).

This is exactly what you did with the resource compiler, specifying the
three lots of *.c files needed to build it; no makefiles or CMake needed
(which failed if you remember).
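(As an illustration of how little is needed, reusing the earlier cipher
demo's file names: gcc, among other compilers, will read such a list
directly as a @response file, so the "build instructions" can be the
list itself:)

$ cat sources.txt
cipher.c
hmac.c
sha2.c

$ gcc -O2 @sources.txt -o cipher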
David Brown
2024-02-02 10:13:42 UTC
Permalink
Post by Malcolm McLean
Post by bart
It works for me, and I'm sure it could work for others if they didn't
have makefiles forced down their throats and hardwired into their brains.
/Nobody/ has makefiles forced on them.  People use "make" because it
is convenient, and it works.  If something better comes along, and it
is better enough to overcome the familiarity momentum, people will use
that.
What?
You have total control of your programming environment and never have to
consider anybody else? For hobby programming you do in a way. Not if you
want other people to use your stuff. But you can always say that the fun
of doing things exactly your way outweighs the fun of getting downloads.
Okay, none of the people talking about "make" /here/ had it forced on
them for the uses they are talking about /here/.

Yes, I have a very large degree of control over my programming
environment - because I work in a company where employees get to make
the decisions that they are best qualified to make, and management's job
is to support them. One of the important factors I consider is
interaction with colleagues and customers, for which "make" works well.

And while people may be required to use make, or particular compilers,
or OS's, no one is forced to /like/ a tool or find it useful. I believe
that when people here say they like make, or find it works well for
them, or that it can handle lots of different needs, or that they know
of nothing better for their requirements, they are being honest about
that. If they didn't like it, they would say.

The only person here whom we can be absolutely sure does /not/ have
"make" forced upon them for their development, is Bart. And he is the
one who complains about it.
bart
2024-02-02 10:54:22 UTC
Permalink
Post by David Brown
Post by Malcolm McLean
Post by bart
It works for me, and I'm sure it could work for others if they didn't
have makefiles forced down their throats and hardwired into their brains.
/Nobody/ has makefiles forced on them.  People use "make" because it
is convenient, and it works.  If something better comes along, and it
is better enough to overcome the familiarity momentum, people will
use that.
What?
You have total control of your programming environment and never have
to consider anybody else? For hobby programming you do in a way. Not
if you want other people to use your stuff. But you can always say that
the fun of doing things exactly your way outweighs the fun of getting
downloads.
Okay, none of the people talking about "make" /here/ had it forced on
them for the uses they are talking about /here/.
Yes, I have a very large degree of control over my programming
environment - because I work in a company where employees get to make
the decisions that they are best qualified to make, and management's job
is to support them.  One of the important factors I consider is
interaction with colleagues and customers, for which "make" works well.
And while people may be required to use make, or particular compilers,
or OS's, no one is forced to /like/ a tool or find it useful.  I believe
that when people here say they like make, or find it works well for
them, or that it can handle lots of different needs, or that they know
of nothing better for their requirements, they are being honest about
that.  If they didn't like it, they would say.
The only person here whom we can be absolutely sure does /not/ have
"make" forced upon them for their development, is Bart.  And he is the
one who complains about it.
Not for my own development, no. Unless that includes having to build
external dependencies from source, which are written in C.

Or just things I want to test my C compiler on.

If I want to build Seed7, for example, that comes with 19 different
makefiles. LibJPEG has 15 different makefiles. GMP has one makefile,
but a 30,000-line configure script dependent on Linux.

I could and have spent a lot of time on many of those in manually
discovering the C files necessary to build the project.

Once done, the process was beautifully streamlined and simple.

But I know this is a waste of time and nobody's mind is going to be changed.
Malcolm McLean
2024-02-02 14:15:39 UTC
Permalink
Post by David Brown
Post by Malcolm McLean
Post by bart
It works for me, and I'm sure it could work for others if they didn't
have makefiles forced down their throats and hardwired into their brains.
/Nobody/ has makefiles forced on them.  People use "make" because it
is convenient, and it works.  If something better comes along, and it
is better enough to overcome the familiarity momentum, people will
use that.
What?
You have total control of your programming environment and never have
to consider anybody else? For hobby programming you do in a way. Not
if you want other people to use your stuff. But you can always say that
the fun of doing things exactly your way outweighs the fun of getting
downloads.
Okay, none of the people talking about "make" /here/ had it forced on
them for the uses they are talking about /here/.
Yes, I have a very large degree of control over my programming
environment - because I work in a company where employees get to make
the decisions that they are best qualified to make, and management's job
is to support them.  One of the important factors I consider is
interaction with colleagues and customers, for which "make" works well.
And while people may be required to use make, or particular compilers,
or OS's, no one is forced to /like/ a tool or find it useful.  I believe
that when people here say they like make, or find it works well for
them, or that it can handle lots of different needs, or that they know
of nothing better for their requirements, they are being honest about
that.  If they didn't like it, they would say.
The only person here whom we can be absolutely sure does /not/ have
"make" forced upon them for their development, is Bart.  And he is the
one who complains about it.
My job is to write the algorithms. Not to set up the build system.
Someone else was given that job and set up the system I described
recently, with git and conan and Azure. He didn't do a bad job at all
and we can get a bug fix out to customers within hours on request. My
input is just to moan when occasionally things go wrong. Had I done it
I'm sure it would have been a lot worse, because basically my skills are
in algorithm development, not setting things up like that.

It's makefile free, thank goodness. My main peeve is boost. When things
go wrong, it always seems to be the boost. I've refused to include it in
my library, despite requests. Whilst it would be nice to have the
threads, I just think it would be a perpetual source of build failures
and grief. That's from experience of boost in other projects. I might
ultimately have to give in. I don't have total control, at the end of
the day, it's not my personal code, it's company code.

Because I'm constantly shifting between platforms, make isn't very
useful to me. And because mainly I do algorithms, the guts of it are
simple source files with no dependencies other than the standard
libraries. So you don't need elaborates systems for pulling things in.
It's just submit a list of sources to the compiler. And you don't need
make to do that. So I don't use make much, for reasons other than that I
don't particularly like it. CMake is a sort of front end to make, which
says it all really. But CMake can also spin up an IDE project file. And
if you are developing rather than just building, that's far more
convenient. So I distribute using CMake in preference to make. But if
someone won't accept CMake, then I'd have no hesitation, and drop down
to make.
--
Check out Basic Algorithms and my other books:
https://www.lulu.com/spotlight/bgy1mm
Janis Papanagnou
2024-02-02 00:46:36 UTC
Permalink
I've nothing against shorter or simpler makefiles. [...]
During mid/late 1990's someone at our site looked for an alternative
to Make. After some evaluation of tools it was decided to not replace
Make. I've just googled for what at that time appeared to be the most
promising candidate (it's obviously still there) and the description
of Jam reads as if it would fulfill some of the requirements that have
been mentioned by various people here (see https://freetype.org/jam/
for details).

Janis
Kaz Kylheku
2024-02-01 16:20:24 UTC
Permalink
Post by David Brown
5. Modules provide encapsulation of data, code and namespaces.
Case study: C++ originally had only classes, which provide this. Then it
acquired the namespace construct, which also provides it. In spite of
that, someone decided it needs modules also.
--
TXR Programming Language: http://nongnu.org/txr
Cygnal: Cygwin Native Application Library: http://kylheku.com/cygnal
Mastodon: @***@mstdn.ca
Lawrence D'Oliveiro
2024-02-01 21:34:31 UTC
Permalink
Post by David Brown
2. You can compile modules independently to allow partial builds.
In our Comp Sci classes we were careful to draw a distinction between
“separate” and “independent” compilation. The latter is exemplified by
(old-style) Fortran and C, where the same name may be declared in multiple
units, and the linker will happily tie them together, but without any
actual checking that the declarations match.

“Separate” compilation, on the other hand, means that there is some
consistency checking done between the declarations, and the program will
fail to build if there are mismatches. Ada has this. And it looks like
Fortran has acquired it, too, since the Fortran 90 spec.
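(A minimal C illustration of the difference, with hypothetical file
names: under independent compilation the linker matches names only, not
types, so this builds without complaint:)

$ cat one.c
extern int limit;                /* declared as int here */
int main(void) { return limit; }

$ cat two.c
double limit = 1.5;              /* but defined as double here */

$ cc one.c two.c -o clash        # links silently; behaviour undefined

A language with checked separate compilation, such as Ada, rejects the
equivalent program at build time.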
Richard Harnden
2024-02-01 16:09:53 UTC
Permalink
Post by bart
BTW that 'make' only works on my machine because it happens to be part
of mingw; none of my other C compilers have make.
And as written, it only works for 'cc' which comes with 'gcc'
Doesn't dos/windows have nmake and cl?
bart
2024-02-01 17:32:01 UTC
Permalink
Post by Richard Harnden
Post by bart
BTW that 'make' only works on my machine because it happens to be part
of mingw; none of my other C compilers have make.
And as written, it only works for 'cc' which comes with 'gcc'
Doesn't dos/windows have nmake and cl?
No.
Scott Lurndal
2024-02-01 19:25:12 UTC
Permalink
Post by Richard Harnden
Post by bart
BTW that 'make' only works on my machine because it happens to be part
of mingw; none of my other C compilers have make.
And as written, it only works for 'cc' which comes with 'gcc'
Doesn't dos/windows have nmake and cl?
No.
You sure about that? They sure used to have them
as an add-on. IIRC, they're still part of visual studio.
bart
2024-02-01 19:51:53 UTC
Permalink
Post by Scott Lurndal
Post by Richard Harnden
Post by bart
BTW that 'make' only works on my machine because it happens to be part
of mingw; none of my other C compilers have make.
And as written, it only works for 'cc' which comes with 'gcc'
Doesn't dos/windows have nmake and cl?
No.
You sure about that? They sure used to have them
as an add-on. IIRC, they're still part of visual studio.
Visual Studio is a 10,000MB monster. It might well have it around, but
it's so complex, it's been years since I've even seen discrete cl.exe
and link.exe programs, despite scouring massive, 11-deep directory
structures.

Meanwhile my everyday compilers are 0.4MB for my language and 0.3MB for C.

I like to keep things simple. Everybody else likes to keep things
complicated, and the more the better.

Anyway, acquiring VS just to build one small program would be like
taking a giant sledgehammer, 1000 times normal size, to crack a tiny nut.
Chris M. Thomasson
2024-02-01 20:12:59 UTC
Permalink
Post by bart
Post by Richard Harnden
Post by bart
BTW that 'make' only works on my machine because it happens to be part
of mingw; none of my other C compilers have make.
And as written, it only works for 'cc' which comes with 'gcc'
Doesn't dos/windows have nmake and cl?
No.
You sure about that?  They sure used to have them
as an add-on.  IIRC, they're still part of visual studio.
Visual Studio is a 10,000MB monster.
Shit happens. I still use MSVC, quite a lot actually. I install
everything! ;^) Have the space, so, well, okay. ;^)
Post by bart
It might well have it around, but
it's so complex, it's been years since I've even seen discrete cl.exe
and link.exe programs, despite scouring massive, 11-deep directory
structures.
Meanwhile my everyday compilers are 0.4MB for my language and 0.3MB for C.
I like to keep things simple. Everybody else likes to keep things
complicated, and the more the better.
Anyway, acquiring VS just to build one small program would be like
taking a giant sledgehammer, 1000 times normal size, to crack a tiny nut.
Chris M. Thomasson
2024-02-01 20:43:51 UTC
Permalink
Post by Chris M. Thomasson
Post by bart
Post by Richard Harnden
Post by bart
BTW that 'make' only works on my machine because it happens to be part
of mingw; none of my other C compilers have make.
And as written, it only works for 'cc' which comes with 'gcc'
Doesn't dos/windows have nmake and cl?
No.
You sure about that?  They sure used to have them
as an add-on.  IIRC, they're still part of visual studio.
Visual Studio is a 10,000MB monster.
Shit happens. I still use MSVC, quite a lot actually. I install
everything! ;^) Have the space, so, well, okay. ;^)
The fat bastard wants me to update to version (17.8.6). I currently have
(17.8.5):

:^)



Ham On! LOL! ;^)
Post by Chris M. Thomasson
Post by bart
It might well have it around, but it's so complex, it's been years
since I've even seen discrete cl.exe and link.exe programs, despite
scouring massive, 11-deep directory structures.
Meanwhile my everyday compilers are 0.4MB for my language and 0.3MB for C.
I like to keep things simple. Everybody else likes to keep things
complicated, and the more the better.
Anyway, acquiring VS just to build one small program would be like
taking a giant sledgehammer, 1000 times normal size, to crack a tiny nut.
Michael S
2024-02-01 20:36:47 UTC
Permalink
On Thu, 1 Feb 2024 19:51:53 +0000
Post by bart
Post by Scott Lurndal
Post by Richard Harnden
Post by bart
BTW that 'make' only works on my machine because it happens to
be part of mingw; none of my other C compilers have make.
And as written, it only works for 'cc' which comes with 'gcc'
Doesn't dos/windows have nmake and cl?
No.
You sure about that? They sure used to have them
as an add-on. IIRC, they're still part of visual studio.
Visual Studio is a 10,000MB monster. It might well have it around,
but it's so complex, it's been years since I've even seen discrete
cl.exe and link.exe programs, despite scouring massive, 11-deep
directory structures.
If you only download command-line build tools then it's somewhat less
huge.
The 2022 version is 3,152,365,436 bytes.
I don't know the size of the installation package. It looks like on my
home PC I used the online installer.
Post by bart
Meanwhile my everyday compilers are 0.4MB for my language and 0.3MB for C.
I like to keep things simple. Everybody else likes to keep things
complicated, and the more the better.
Anyway, acquiring VS just to build one small program would be like
taking a giant sledgehammer, 1000 times normal size, to crack a tiny
nut.
David Brown
2024-02-01 22:09:48 UTC
Permalink
Post by Richard Harnden
Post by bart
BTW that 'make' only works on my machine because it happens to be part
of mingw; none of my other C compilers have make.
And as written, it only works for 'cc' which comes with 'gcc'
Doesn't dos/windows have nmake and cl?
Those are part of MSVC, which runs on Windows but does not come with it.
"nmake" is MS's version of "make", and has been shipped with most MS
development tools for many decades.
Lawrence D'Oliveiro
2024-02-01 23:32:46 UTC
Permalink
"nmake" is MS's version of "make" ...
I think they did originally have a tool called “make”. But this was so
crap in comparison to the GNU/POSIX equivalent that they changed the name
in the new version to try to distance themselves from the bad taste the
old version left in people’s mouths.
Lawrence D'Oliveiro
2024-01-31 21:17:37 UTC
Permalink
Post by vallor
$ make -j
The last time I tried that on an FFmpeg build, it brought my machine to
its knees. ;)
Nicolas George
2024-01-31 22:28:45 UTC
Permalink
Post by Lawrence D'Oliveiro
The last time I tried that on an FFmpeg build, it brought my machine to
its knees. ;)
But at least, with FFmpeg's build system, parallel build works.
Lawrence D'Oliveiro
2024-02-01 00:27:52 UTC
Permalink
Post by Nicolas George
Post by Lawrence D'Oliveiro
Post by vallor
$ make -j
The last time I tried that on an FFmpeg build, it brought my machine to
its knees. ;)
But at least, with FFmpeg's build system, parallel build works.
I still feel that the “-j” option is a bit dangerous: the fact that you
can specify it with or without an argument means it is too easy to tell it
to use an unlimited number of processes.

Tip: I often use

make -j$(nproc)

though I’m told you may want to add 1 to this.
Scott Lurndal
2024-02-01 01:28:27 UTC
Permalink
Post by Nicolas George
Post by Lawrence D'Oliveiro
Post by vallor
$ make -j
The last time I tried that on an FFmpeg build, it brought my machine to
its knees. ;)
But at least, with FFmpeg's build system, parallel build works.
I still feel that the “-j” option is a bit dangerous: the fact that you
can specify it with an argument or without means it is too easy to tell it
to use unlimited numbers of processes.
Tip: I often use
make -j$(nproc)
though I’m told you may want to add 1 to this.
I use make -j $(( 2 * $(nproc) )) for builds, which are often I/O-bound.
David Brown
2024-02-01 08:48:45 UTC
Permalink
Post by Lawrence D'Oliveiro
Post by vallor
$ make -j
The last time I tried that on an FFmpeg build, it brought my machine to
its knees. ;)
Sometimes "make -j" can be a bit enthusiastic about the number of
processes it starts. If there are many operations it /could/ do, trying
to run them all can chew through a lot more memory than you'd like. I
usually use something like "make -j 8", though the ideal number of
parallel tasks depends on the number of cpu cores you have, their type
(SMT threads or real cores, "big" cores or "little" cores), memory,
speed of disks, additional tools like ccache or distcc, etc.

I'd rather "make -j" (without a number) defaulted to using the number of
cpu cores, as that is a reasonable guess for most compilations.
Keith Thompson
2024-02-01 19:49:36 UTC
Permalink
David Brown <***@hesbynett.no> writes:
[...]
Post by David Brown
I'd rather "make -j" (without a number) defaulted to using the number
of cpu cores, as that is a reasonable guess for most compilations.
Agreed, but there might not be a sufficiently portable way to determine
that number.
--
Keith Thompson (The_Other_Keith) Keith.S.Thompson+***@gmail.com
Working, but not speaking, for Medtronic
void Void(void) { Void(); } /* The recursive call of the void */
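
ObC: Keith's point is that ISO C itself has no such facility. Here is a
minimal sketch of the usual workaround, assuming sysconf() and the widely
supported but non-standard _SC_NPROCESSORS_ONLN name (glibc, the BSDs,
macOS); Windows would want GetSystemInfo() instead (not shown). The file
name is invented:

$ cat nprocs.c
#include <stdio.h>
#include <unistd.h>   /* sysconf() -- POSIX, not ISO C */

int main(void)
{
    long n = sysconf(_SC_NPROCESSORS_ONLN);  /* processors online now */
    if (n < 1) {                             /* -1 or 0: can't tell */
        fprintf(stderr, "cannot determine processor count\n");
        return 1;
    }
    printf("%ld\n", n);
    return 0;
}
_ _ _ _ _ _ _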
Lawrence D'Oliveiro
2024-02-01 21:39:40 UTC
Permalink
Post by Keith Thompson
Post by David Brown
I'd rather "make -j" (without a number) defaulted to using the number
of cpu cores, as that is a reasonable guess for most compilations.
Agreed, but there might not be a sufficiently portable way to determine
that number.
nproc(1) is part of the GNU Core Utilities
<manpages.debian.org/1/nproc.1.html>.
Keith Thompson
2024-02-01 23:24:00 UTC
Permalink
Post by Lawrence D'Oliveiro
Post by Keith Thompson
Post by David Brown
I'd rather "make -j" (without a number) defaulted to using the number
of cpu cores, as that is a reasonable guess for most compilations.
Agreed, but there might not be a sufficiently portable way to determine
that number.
nproc(1) is part of the GNU Core Utilities
<manpages.debian.org/1/nproc.1.html>.
And GNU make is not, so it's possible that a system might have make but
not nproc. Also, nproc was added to GNU Coreutils in 2009, and the
current meaning of "make -j" with no numeric argument was defined before
that.

A new "-J" option that means "-j $(nproc)" might be useful, but it's
easy enough to use "make -j $(nproc)".
--
Keith Thompson (The_Other_Keith) Keith.S.Thompson+***@gmail.com
Working, but not speaking, for Medtronic
void Void(void) { Void(); } /* The recursive call of the void */
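
ObC: that hypothetical "-J" is easy enough to fake with a tiny front end.
A sketch, assuming the same non-standard _SC_NPROCESSORS_ONLN as above
plus POSIX execvp(); "jmake" is an invented name, not a real tool:

$ cat jmake.c
#include <stdio.h>
#include <stdlib.h>
#include <unistd.h>   /* sysconf(), execvp() -- POSIX */

int main(int argc, char **argv)
{
    long n = sysconf(_SC_NPROCESSORS_ONLN);
    char jobs[32];
    char **args;
    int i;

    if (n < 1)
        n = 1;                          /* can't tell: stay serial */
    snprintf(jobs, sizeof jobs, "-j%ld", n);

    /* Run: make -j<n> <whatever arguments the user passed>. */
    args = malloc((argc + 2) * sizeof *args);
    if (args == NULL)
        return EXIT_FAILURE;
    args[0] = "make";
    args[1] = jobs;
    for (i = 1; i < argc; i++)
        args[i + 1] = argv[i];
    args[argc + 1] = NULL;

    execvp("make", args);               /* returns only if exec fails */
    perror("jmake: execvp");
    return EXIT_FAILURE;
}

$ ./jmake         # behaves like make -j$(nproc)
$ ./jmake clean   # extra arguments pass straight through
_ _ _ _ _ _ _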
Lawrence D'Oliveiro
2024-02-01 23:38:17 UTC
Permalink
Post by Keith Thompson
Post by Lawrence D'Oliveiro
nproc(1) is part of the GNU Core Utilities
<manpages.debian.org/1/nproc.1.html>.
And GNU make is not, so it's possible that a system might have make but
not nproc.
While that is theoretically possible, I somehow think such an installation
would feel to the typical *nix user somewhat ... crippled.

Particularly since the “install” command is part of coreutils.

Also imagine trying to do builds, or any kind of development, on a system
without the “mkdir” command--another component of coreutils.
Kaz Kylheku
2024-02-01 23:53:03 UTC
Permalink
Post by Lawrence D'Oliveiro
Post by Keith Thompson
Post by Lawrence D'Oliveiro
nproc(1) is part of the GNU Core Utilities
<manpages.debian.org/1/nproc.1.html>.
And GNU make is not, so it's possible that a system might have make but
not nproc.
While that is theoretically possible, I somehow think such an installation
would feel to the typical *nix user somewhat ... crippled.
Selected GNU programs can be individually installed on Unix-like systems
which already have other tools of their own.
Post by Lawrence D'Oliveiro
Particularly since the “install” command is part of coreutils.
The install utility appeared in 4.2 BSD, which was released in
August 1983.

The GNU Project was announced in September 1983.
--
TXR Programming Language: http://nongnu.org/txr
Cygnal: Cygwin Native Application Library: http://kylheku.com/cygnal
Mastodon: @***@mstdn.ca
David Brown
2024-02-01 22:14:05 UTC
Permalink
Post by Keith Thompson
[...]
Post by David Brown
I'd rather "make -j" (without a number) defaulted to using the number
of cpu cores, as that is a reasonable guess for most compilations.
Agreed, but there might not be a sufficiently portable way to determine
that number.
gcc manages to figure it out for parallel tasks such as LTO linking. I
think it would be reasonable for make to use the number of cores when it
can figure that out, and a default (say, 4) when it cannot.
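
ObC: that policy takes only a few lines to express. A sketch, again
leaning on the non-standard _SC_NPROCESSORS_ONLN; note that the status
from system() is reduced to a plain success/failure exit code rather
than returned raw, since the two formats differ. "defjobs.c" is an
invented name:

$ cat defjobs.c
#include <stdio.h>
#include <stdlib.h>
#include <unistd.h>   /* sysconf() -- POSIX */

int main(void)
{
    long jobs = sysconf(_SC_NPROCESSORS_ONLN);
    char cmd[64];

    if (jobs < 1)
        jobs = 4;   /* couldn't figure it out: default, say, 4 */
    snprintf(cmd, sizeof cmd, "make -j %ld", jobs);
    return system(cmd) == 0 ? EXIT_SUCCESS : EXIT_FAILURE;
}
_ _ _ _ _ _ _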