tenser wrote to All <=-
RIP Niklaus Wirth, 1934-2024.
Wirth was the inventor of the Algol/W, Pascal, Modula-2, and
Oberon programming languages, the Oberon operating system,
the Lilith workstation, and so much more. It is impossible
to overstate his importance to modern computing.
We have lost a giant.
Anyone who studied CS in the 90s on probably started off with Pascal -
not sure about nowadays.
Certainly accurate in my case, at the very end of the 90s. I loved Pascal back then and really couldn't see why people preferred C. I think I switched to C when I started using Linux - GCC was a lot easier to lay my hands on than... whatever the good Pascal compiler was!
RIP Niklaus Wirth, 1934-2024.
--- Mystic BBS v1.12 A48 (Linux/64)
* Origin: Agency BBS | Dunedin, New Zealand | agency.bbs.nz (21:1/101)
Bob Worm wrote to poindexter FORTRAN <=-
Certainly accurate in my case, at the very end of the 90s. I loved
Pascal back then and really couldn't see why people preferred C.
Anyone who studied CS in the 90s on probably started off with Pascal -
not sure about nowadays.
Nightfox wrote to Bob Worm <=-
I took a few CS programming classes in college in 1999-2000, and they
were teaching C++. I ended up going into a software engineering
program. I never had any Pascal classes in college.
esc wrote to Bob Worm <=-
I'd still like to go learn Pascal simply because of how prominent it
was in all things BBSing, and because of the number of Pascal sources related to BBSes still floating around out there. I wonder how long it'd take.
Dr. What wrote to Bob Worm <=-
Because of Turbo Pascal. Pascal was designed to be an efficient,
one-pass-compile language, and it lived up to that. Turbo Pascal was a great tool that produced efficient code.
I took a few CS programming classes in college in 1999-2000, and they
were teaching C++. I ended up going into a software engineering program.
I never had any Pascal classes in college.
That's really strange. Most CS curricula when I was starting out had a structured-language class as the first course; it was sort of a weed-out class, taught in Pascal. They later switched (in my case) to ANSI C. This was before C++.
Tiny, too.
(From https://prog21.dadgum.com/116.html)
The entire Turbo Pascal 3.02 executable--the compiler and IDE--was
39,731 bytes. How does that stack up in 2011 terms? Here are some things that Turbo Pascal is smaller than, as of October 30, 2011:
I'd still like to go learn Pascal simply because of how prominent it was in all things BBSing, and the amount of Pascal sources related to BBSes are out there floating around. I wonder how long it'd take.
We were still required to take BASIC, with Pascal offered as an elective, but I suspect that the school I went to was a little behind the times.
We were still required to take BASIC, with Pascal offered as an elective,
but I suspect that the school I went to was a little behind the times.
Mine was much the same. BASIC, touching on other stuff... Pascal was probably
the most likely, the rest just rating mentions. I suspect it reflects more
on the age of the people writing the texts at the time. This is mid
to late 80s...
Blue White wrote to poindexter FORTRAN <=-
We were still required to take BASIC, with Pascal offered as an
elective, but I suspect that the school I went to was a little behind
the times.
Ben Collver wrote to poindexter FORTRAN <=-
I learned Pascal because that's what they taught at school.
My first PC had no fixed disk, only 5-1/4" floppies.
It was important for software to be small like this.
Nightfox wrote to poindexter FORTRAN <=-
That's interesting.. For the software engineering program I went into later, for the first term, they didn't even have us do any programming, but they went over software design concepts (including object-oriented design), and the next term, they went right into C++.
That makes sense, the PASCAL class was used mostly to teach data
structures and algorithms, then everything else was in C, assembler or
LISP. In retrospect I'd rather just jump into C or C++ instead of
spending time learning another language.
That makes sense, the PASCAL class was used mostly to teach data structures and algorithms, then everything else was in C, assembler or LISP. In retrospect I'd rather just jump into C or C++ instead of
spending time learning another language.
tenser wrote to poindexter FORTRAN <=-
Honestly, at this point, I can't think of a good reason
to teach C at the collegiate level. Intro classes should
arguably be in a functional language of some kind; I like
Scheme, but Racket would be better; barring that, OCaml
or even SML would work well.
For low-level details, I'd teach assembler on RISC-V, and
then follow up with Rust or Go; maybe Zig.
Things change over time. Universities used to teach COBOL
to CS students; now they don't. Pascal was favored for a
time, now it isn't. Java even made a play for a while. C
is similar.
poindexter FORTRAN wrote to Nightfox <=-
That makes sense, the PASCAL class was used mostly to teach data structures and algorithms, then everything else was in C, assembler or LISP. In retrospect I'd rather just jump into C or C++ instead of
spending time learning another language.
That's interesting.. For the software engineering program I went into
later, for the first term, they didn't even have us do any programming,
but they went over software design concepts (including object-oriented
design), and the next term, they went right into C++.
tenser wrote to poindexter FORTRAN <=-
Honestly, at this point, I can't think of a good reason
to teach C at the collegiate level. Intro classes should
arguably be in a functional language of some kind; I like
Scheme, but Racket would be better; barring that, OCaml
or even SML would work well.
It is in much more common use than any of the other languages, so
having some marketable experience is a good thing.
For low-level details, I'd teach assembler on RISC-V, and
then follow up with Rust or Go; maybe Zig.
I learned assembler on a VAX. Nice instruction set, but we only had a handful of serial ports for a ton of students. I became a night-owl that semester, logging on from 2-6am to get my work done.
Things change over time. Universities used to teach COBOL
to CS students; now they don't. Pascal was favored for a
time, now it isn't. Java even made a play for a while. C
is similar.
My university started a business/CS degree, they learned COBOL, PC databases and the like. It might have been a more useful degree for me than my "hard" CS degree.
poindexter FORTRAN wrote to Nightfox <=-
That makes sense, the PASCAL class was used mostly to teach data structures and algorithms, then everything else was in C, assembler or LISP. In retrospect I'd rather just jump into C or C++ instead of spending time learning another language.
Pascal was intended to be a teaching language that taught you good programming practice. So that when you got to C/C++, and didn't have
the seatbelts that Pascal gave you, you didn't kill yourself when your program crashed.
Re: Re: RIP Niklaus Wirth
By: tenser to poindexter FORTRAN on Mon Jan 08 2024 03:51 am
Honestly, at this point, I can't think of a good reason to teach C at the collegiate level. Intro classes should arguably be in a functional language of some kind; I like Scheme, but Racket would be better; barring that, OCaml or even SML would work well.
I haven't heard of Scheme, Racket, OCaml or SML.. Are those fairly new languages?
I think one of the reasons for teaching C (or C++), at least
at the time, was that those languages were in fairly wide use in the industry. I've been hearing they're being used less, but at the same time, they're still fairly popular languages due to the amount of older code in existence. And I've heard C is especially popular for embedded software.
I took a few CS programming classes in college in 1999-2000, and they
were teaching C++. I ended up going into a software engineering
program. I never had any Pascal classes in college.
tenser wrote to Dr. What <=-
It was always easy to pick out the kids who'd been
exposed to COBOL and then learned C; their C code
tended to be overly verbose and not terribly idiomatic.
It'd take them a good while to come up to speed.
The ones who came from BASIC had it the worst, though.
tenser wrote to poindexter FORTRAN <=-
Yeah, a lot of MIS/CIS places continued to teach COBOL for
a long time. I wonder what they teach now....
tenser wrote to Dr. What <=-
It was always easy to pick out the kids who'd been
exposed to COBOL and then learned C; their C code
tended to be overly verbose and not terribly idiomatic.
It'd take them a good while to come up to speed.
hollowone wrote to Nightfox <=-
Some tried Delphi, but only for some DB apps. Delphi was never as
universal as Borland/Turbo Pascal could be. I believe that was one of the
reasons for Borland's demise shortly after.
tenser wrote to Dr. What <=-
It was always easy to pick out the kids who'd been
exposed to COBOL and then learned C; their C code
tended to be overly verbose and not terribly idiomatic.
It'd take them a good while to come up to speed.
For me, the "tell" was no local variables, and functions with a huge
number of parameters and sub-functions. But I think that goes with the "not terribly idiomatic" category.
Did you know that 'lint' crashes if you have too many local variables?
I didn't until I had to run some C-COBOL through it.
The ones who came from BASIC had it the worst, though.
It depends, but you're mostly right.
I learned BASIC on my own. But by the end of High School, I had already "discovered" Structured Programming (before I even knew who Edsger Dijkstra was), using GOSUB and minimizing GOTOs. Commodore BASIC didn't have WHILE/WEND loops until much later.
tenser wrote to poindexter FORTRAN <=-
Yeah, a lot of MIS/CIS places continued to teach COBOL for
a long time. I wonder what they teach now....
If I were freelancing, I wouldn't mind having a business process control language like COBOL or RPG in my toolkit. I wouldn't want to rely on it, mind you...
I learned BASIC on my own. But by the end of High School, I had already "discovered" Structured Programming (before I even knew who Edsger Dijkstra was), using GOSUB and minimizing GOTOs. Commodore BASIC didn't have WHILE/WEND loops until much later.
They're still used quite a bit, but with memory safety coming up as a big issue, and for that matter with the US government starting to introduce legislation to limit the use of memory unsafe languages in government, their use is likely to decline faster (at least as usually written).
(And yes, Congress has recently introduced a bill directing the DoD to come up with a plan to limit the use of memory unsafe languages like C and C++.)
I took a few CS programming classes in college in 1999-2000, and they
were teaching C++. I ended up going into a software engineering program.
I never had any Pascal classes in college.
I think that's the time when all schools successfully adopted Windows and dropped MS-DOS from their curricula. Pascal (and Borland's compilers and tools specifically) was dominant in MS-DOS times.
Re: Re: RIP Niklaus Wirth
By: tenser to Nightfox on Mon Jan 08 2024 04:06 pm
They're still used quite a bit, but with memory safety coming up as a big issue, and for that matter with the US government starting to introduce legislation to limit the use of memory unsafe languages in government, their use is likely to decline faster (at least as usually written).
(And yes, Congress has recently introduced a bill directing the DoD to come up with a plan to limit the use of memory unsafe languages like C and C++.)
Interesting.. For C++, there have been some new things introduced to the standard library to help with dynamic memory management. Some such
things are the helper classes unique_ptr and shared_ptr, which manage a dynamically-allocated object for you (i.e., the memory will be freed when the helper object goes out of scope and it's the last one referencing the memory, etc.). I've even seen some C++ developers say
they don't have to write a 'delete' statement anymore.
I live near Santa Cruz, and drive by the old Borland building occasionally, Seagate's old office (the address was 1 disk drive...) and where I took a UNIX training class at SCO.
A friend of mine went sailing on Philippe Kahn's sailboat a few years
ago, just before Covid. Apparently life after Borland is treating him well.
One thing that I've found odd is that in some programming forums on Facebook & such, there have been some students asking questions and
saying their instructor was having them use Borland Turbo C++ for DOS - And this was well into the 2010s.
time, now it isn't. Java even made a play for a while. C
Yup, Java was the first programming language they taught to us. I don't miss it at all.
Maybe the course was "History of programming" :)
Or something retro. Some teachers at universities in my country, haunted by the past, also introduce retro languages and tools for fun. Especially if they were demosceners back in the day. I know one who is running a course this year on how to make visual effects on the Amiga!
tenser wrote to Dr. What <=-
Yeah. A lot of folks who taught themselves BASIC
on 8- and 16-bit micros picked up better practices, but
they were in such a constrained environment it was hard
for them to take advantage of more advanced structures.
Bob Worm wrote to Dr. What <=-
Leaving to university still not really sure why GOTOs were bad was probably not an ideal starting position!
poindexter FORTRAN wrote to tenser <=-
If I were freelancing, I wouldn't mind having a business process
control language like COBOL or RPG in my toolkit. I wouldn't want to
rely on it, mind you...
poindexter FORTRAN wrote to tenser <=-
I suppose learning Pascal helped - subroutines defined first, no
GOTOs...
All the coding I did in college and the only code I have remaining from that time is the original DOS batch file for my BBS, all errorlevel
jumps and GOTOs...
tenser wrote to poindexter FORTRAN <=-
Hey, with IBM pushing AI to translate those piles of
dusty COBOL into Java, maybe you'd never need it!
Nightfox wrote to tenser <=-
reference the memory, etc.). I've even seen some C++ developers say
they don't have to write a 'delete' statement anymore.
Nightfox wrote to hollowone <=-
One thing that I've found odd is that in some programming forums on Facebook & such, there have been some students asking questions and
saying their instructor was having them use Borland Turbo C++ for DOS - And this was well into the 2010s.
tenser wrote to poindexter FORTRAN <=-
Hey, with IBM pushing AI to translate those piles of
dusty COBOL into Java, maybe you'd never need it!
Nightfox wrote to unixl0rd <=-
As a junior in college, I took a class about Java server-side
programming with Java servlets & such. I remember the teacher
mentioning a couple of Java functions, activate() and passivate(), and
one of the students said "passivate? To make passive..?" :P
Dr. What wrote to Bob Worm <=-
When I was in high school, by the time they formally taught
programming, there were a number of us who knew more than the teacher.
The teacher was smart and made friends with us and we helped things
along in the computer lab.
Dr. What wrote to Nightfox <=-
Colleges are often behind the times on technology. Mostly due to money.
When I got to college, my classes were on a Univac 1100/80 and during crunch time, the computer could actually lose your parameters between
the main program and the subroutine. Very frustrating - but got you to not wait until crunch time to do your assignment.
For those of us who worked on systems whose memory (RAM and disk) was measured in K, we have a habit of asking those questions.
Hey, with IBM pushing AI to translate those piles of
dusty COBOL into Java, maybe you'd never need it!
When I was in high school, by the time they formally taught programming, there were a number of us who knew more than the teacher. The teacher was smart and made friends with us and we helped things along in the computer lab.
Later, I realized he had a consulting gig and was using us as junior
talent and charging for our time. We got A+ grades in the class.
One thing that I've found odd is that in some programming forums on
Facebook & such, there have been some students asking questions and
saying their instructor was having them use Borland Turbo C++ for DOS -
And this was well into the 2010s.
Colleges are often behind the times on technology. Mostly due to money.
That's truly amazing to me. I have been hunting around for any kind of "introduction to demoscene coding" type content for months now and found very little online. I'd love to see lectures like this - demoscene techniques presented by someone who teaches for a living...
That's truly amazing to me. I have been hunting around for any kind of
"introduction to demoscene coding" type content for months now and found
very little online. I'd love to see lectures like this - demoscene techniques presented by someone who teaches for a living...
I have good news and bad news for you.
Good news is that the course I've mentioned is online:
https://www.youtube.com/watch?v=JNBI3LKWMsM&list=PL-uCI5sq2RyT-WPH8NqA0bd4Q9oNF0sx0
Bad news is that this coder is Polish and his course is also available only in Polish. Not sure if YT offers captions from PL to ENG, though.
tenser wrote to poindexter FORTRAN <=-
Hey, with IBM pushing AI to translate those piles of
dusty COBOL into Java, maybe you'd never need it!
It would be interesting to see that Java code.
COBOL primarily runs on IBM (or compatible) mainframes. Those
mainframes are, in effect, COBOL machines.
I remember doing assembly language on them and was surprised when I
found an assembly command to add 2 packed numbers.
(For those who don't know, "packed" is something like BCD. The number 1234 would be "packed" into 3 bytes of "01", "23" and "4C" (the "C" is a sign.)
Every other assembly language I ever encountered only worked on binary format numbers.
tenser wrote to poindexter FORTRAN <=-
Hey, with IBM pushing AI to translate those piles of
dusty COBOL into Java, maybe you'd never need it!
I don't know, someone needs to run the AI, right? At least for the time being...
My son is about to graduate with a marketing communications degree. I don't think AI is going to render him unemployable, I think the jobs are going to change. There won't be tons of human content creators, rather a handful of people who can effectively leverage AI to create content.
Ditto for coding. If one person managing an LLM can replace a small team, people can focus on program logic and let the LLM do the heavy lifting - the same way that computers allowed mathematicians to stop doing the repetitive calculations and focus on the big picture.
The class and the language are history, but the handle lives on...
Hey, with IBM pushing AI to translate those piles of
dusty COBOL into Java, maybe you'd never need it!
I worked in a shop that was trying to replace COBOL with Java. They
found that, while Java did well at replacing "screen" programs, they ran into difficulties when it came to some of the complex mathematics
and other heavy processing that happened during their nightly batch processing.
On the same machine, an IBM mainframe, the dusty COBOL code outperformed what they were trying to replace it with.
AI was not the translator in this case... it was a team of COBOL and Java developers... but our experience with the AI of the time was similar.
My experience with COBOL vs. what we called "distributed" developers was that the latter didn't like COBOL because (1) it didn't have the same "frameworks" that would fill in the code for them (because they couldn't code whatever language they were supposedly coding without help), and (2) if they could code their language on their own, whatever they were coding would be difficult for others to read, understand, or maintain (job security).
My sense is that Java programmers favor a style that is heavy on frameworks, design patterns, and abstraction, often to the point of excess. This yields all sorts of efficiency problems, among other things.
poindexter FORTRAN wrote to Dr. What <=-
And tenured professors. I recall having CS professors telling what it
was like in the "Real World", when I was finishing up my classes (and
coding in the real world). What they remembered wasn't what I was
experiencing.
Bob Worm wrote to Dr. What <=-
Perhaps as an adult I should try to learn some good practices? My code
is very utilitarian - gets the job of the day done but, man, nobody
would want to work on it with me!
Nightfox wrote to Dr. What <=-
Colleges are often behind the times on technology. Mostly due to money.
20+ years behind the times though?
Blue White wrote to tenser <=-
I worked in a shop that was trying to replace COBOL with Java. They
found that, while Java did well at replacing "screen" programs, they
ran into difficulties when it came to some of the complex mathematics
and other heavy processing that happened during their nightly batch processing.
On the same machine, an IBM mainframe, the dusty COBOL code
outperformed what they were trying to replace it with.
COBOL has put food on my table for roughly 27 years now. I am glad I learned it.
Re: Re: RIP Niklaus Wirth
By: tenser to Blue White on Wed Jan 10 2024 01:27 pm
My sense is that Java programmers favor a style that is heavy on frameworks, design patterns, and abstraction, often to the point of excess. This yields all sorts of efficiency problems, among other things.
I haven't done much with Java (aside from some Android work), but I've seen this with programmers in some other languages too. Sometimes
people seem to like to make a bunch of different classes to handle different parts of the behavior, and abstract things into the various classes and to interfaces, etc., and that at least makes it more
difficult to learn the codebase for someone new. Sometimes, such
designs seem overly complicated.
Which is why those old languages are still out there. FORTRAN, for example, is still used and has been updated. I think we're up to
FORTRAN 90 now.
But other languages are trying to take over. Julia is a good example here. It's very much like Python, but it's also very fast with a focus
on math. A good potential replacement for FORTRAN.
At some point, these applications need to be rewritten. Oh, now I will have nightmares about the FORTRAN IV code that I worked on in 1984 that, even though I converted it to VS FORTRAN at that time, is still running and hasn't been rewritten into something more modern.
tenser wrote to Dr. What <=-
Fortran 2018, actually. It only bears a passing resemblance
to FORTRAN-77, let alone -66 or IV (or earlier).
tenser wrote to Dr. What <=-
Fortran 2018, actually. It only bears a passing resemblance
to FORTRAN-77, let alone -66 or IV (or earlier).
That's what I figured. When I was researching FORTRAN not too long ago, the YouTube tutorials in "newer" FORTRAN were very interesting.
I've always felt that FORTRAN was a "medium level" programming language. That is, not at the assembly language level, but close enough to the hardware that it could be very fast.
It's an awful tool for doing things like UIs, but is wonderful for just crunching numbers.
Anyone who studied CS in the 90s on probably started off with Pascal -
not sure about nowadays.
Bob Worm wrote to poindexter FORTRAN <=-
Re: Re: RIP Niklaus Wirth
By: poindexter FORTRAN to Dr. What on Tue Jan 09 2024 06:31:00
Later, I realized he had a consulting gig and was using us as junior
talent and charging for our time. We got A+ grades in the class.
The question is, did figuring that out make you laugh or scowl?
tenser wrote to poindexter FORTRAN <=-
I don't know, someone needs to run the AI, right? At least for the time being...
I'm not sure what you mean by "run".... But yeah, right
now humans are firmly in charge of the whole shebang.
Dr. What wrote to poindexter FORTRAN <=-
Other profs, though, really had no idea what things were like.
And none of my college classes prepared me for the office politics.
Dr. What wrote to Bob Worm <=-
My first real IT job was working on the FORTRAN IV code that processes
data from a car crash test. All written by engineers (not computer
scientists). It got the job done with the resources that they had
(which, at the time it was written, was an IBM mainframe with 8K of core RAM), but
it was not nice to change.
Dr. What wrote to Nightfox <=-
My Freshman year in college was the first year CS majors DIDN'T have to use punch cards. That would have been 1983. Some engineers still had
to use them, though.
Dr. What wrote to Blue White <=-
Blue White wrote to tenser <=-
Which is why those old languages are still out there. FORTRAN, for example, is still used and has been updated. I think we're up to
FORTRAN 90 now.
tenser wrote to Dr. What <=-
There's an old joke. Q: "What language will scientists and
engineers be programming in in 50 years?" A: "I don't know,
but it will be called Fortran."
But other languages are trying to take over. Julia is a good example here. It's very much like Python, but it's also very fast with a focus
on math. A good potential replacement for FORTRAN.
At some point, these applications need to be rewritten. Oh, now I will have nightmares about the FORTRAN IV code that I worked on in 1984 that, even though I converted it to VS FORTRAN at that time, is still running and hasn't been rewritten into something more modern.
A friend of mine, now retired, is a former architect for
high performance systems at Intel (did a PhD in transputers,
worked on HPC-stuff his entire career). He likes to say,
"Fortran pays my salary." He didn't personally program in
it himself, but he had some very convincing arguments
about _why_ Fortran was still used and would (and should)
continue to be. Not only is modern Fortran actually a
pretty reasonable language, it turns out that a lot of the
design of the language lends itself to very aggressive
optimizations; the aliasing rules, for instance, mean that
the compiler can automatically parallelize lots of programs
in a way that other languages (so far) haven't been able to
match. Also, "the math hasn't changed."
All that said, it'll be interesting to keep an eye on Julia
and see where it goes. And this is to say nothing about
Matlab, Mathematica, and other interactive environments that
have Fortran-like languages and are popular in science
and engineering.
Dr. What wrote to Blue White <=-
Blue White wrote to tenser <=-
Which is why those old languages are still out there. FORTRAN, for example, is still used and has been updated. I think we're up to FORTRAN 90 now.
I remember when the "Numerical Recipes in FORTRAN" book became
"Numerical Recipes in C". C seemed like the wrong language for the
higher-level stuff that FORTRAN did well.
All consciousnesses have merged with the galactic singularity, except
for one guy who insists on joining through his IRC client.
"I have it set up the way I want, Okay?!"
poindexter FORTRAN wrote to Dr. What <=-
Other profs, though, really had no idea what things were like.
And none of my college classes prepared me for the office politics.
Office Politics 101. Now, that would be a valuable class!
poindexter FORTRAN wrote to Dr. What <=-
"SONNY, WHEN I STARTED OUT WE DIDN'T HAVE THOSE NEWFANGLED TERMINALS,
WE USED PUNCHED CARDS TO ENTER OUR JOBS..." I get to say that now.
poindexter FORTRAN wrote to Dr. What <=-
I remember when the "Numerical Recipes in FORTRAN" book became
"Numerical Recipes in C". C seemed like the wrong language for the
higher-level stuff that FORTRAN did well.
Ya, I remember the same when P.J.'s "Programming Pearls" got a revamp
into "Programming Pearls for Pascal".
But as languages evolved, and programmer needs changed, it was important that newbies learned good ways of doing things. Otherwise you end up
with even more fodder for The Daily WTF.
I was in my senior year of high school in 1983. We were programming on Commodore CBM machines (PETs with custom ROMS for graphics and 32K of
RAM instead of 8K).
On 07 Jan 2024 at 11:42a, Dr. What pondered and said...
poindexter FORTRAN wrote to Nightfox <=-
That makes sense, the PASCAL class was used mostly to teach data
structures and algorithms, then everything else was in C, assembler or
LISP. In retrospect I'd rather just jump into C or C++ instead of
spending time learning another language.
Pascal was intended to be a teaching language that taught you good
programming practice. So that when you got to C/C++, and didn't have
the seatbelts that Pascal gave you, you didn't kill yourself when your
program crashed.
It was always easy to pick out the kids who'd been
exposed to COBOL and then learned C; their C code
tended to be overly verbose and not terribly idiomatic.
It'd take them a good while to come up to speed.
The ones who came from BASIC had it the worst, though.
tenser wrote to Dr. What <=-
"Programming Pearls" was written by Jon Bentley
(whose father, incidentally, was a Marine at the
Chosin Reservoir during the Korean War).
You may be thinking of, "Software Tools", which was
in ratfor ("Rational FORTRAN" --- a preprocessor that
took a semi-structured language and emitted FORTRAN)
and later translated into Pascal.
That was by Brian
Kernighan and PJ Plauger. Incidentally, that caused
Kernighan to write his "Why Pascal is Not My Favorite
Programming Language" paper, which is worth a read: https://lysator.liu.se/c/bwk-on-pascal.html
Dr. What wrote to poindexter FORTRAN <=-
Office Politics 101. Now, that would be a valuable class!
As valuable as Data Structures, FORTRAN and C, that's for sure.
My son is about to graduate with a marketing communications degree. I
don't think AI is going to render him unemployable, I think the jobs are going to change. There won't be tons of human content creators, rather a handful of people who can effectively leverage AI to create content.
poindexter FORTRAN wrote to Dr. What <=-
The kids ended up learning email skills, how to follow up, the
importance of deliverables and meeting deadlines, how to dress and so
on.
IMO it looks to me like some IT fields are also going to bubble. A lot
of IT consists of creating expectations for a product and then waiting
for investors to roll in, and building vaporware for years without
actually delivering. I have seen the first medium-sized crash at a
local level in a couple of years. It actually makes me worried for my
friends in IT.
Dr. What wrote to poindexter FORTRAN <=-
poindexter FORTRAN wrote to Dr. What <=-
The kids ended up learning email skills, how to follow up, the
importance of deliverables and meeting deadlines, how to dress and so
on.
We definitely need more of this today. The once a year "Take
your kid to work day" really doesn't cut it.
My current company does intern programs in the summer and it is
very successful. But it takes time and money (since someone has
to manage, find work and mentor the interns) and there are more
than few companies who are too cheap for that (like my previous
employer).
On 07 Jan 2024 at 11:42a, Dr. What pondered and said...
poindexter FORTRAN wrote to Nightfox <=-
That makes sense, the PASCAL class was used mostly to teach data
structures and algorithms, then everything else was in C, assembler or
LISP. In retrospect I'd rather just jump into C or C++ instead of
spending time learning another language.
Pascal was intended to be a teaching language that taught you good
programming practice. So that when you got to C/C++, and didn't have
the seatbelts that Pascal gave you, you didn't kill yourself when your
program crashed.
It was always easy to pick out the kids who'd been
exposed to COBOL and then learned C; their C code
tended to be overly verbose and not terribly idiomatic.
It'd take them a good while to come up to speed.
The ones who came from BASIC had it the worst, though.
The COBOL kids had shorter fingers, also known as COBOL fingers... :)
tenser wrote to Dr. What <=-
"Programming Pearls" was written by Jon Bentley
(whose father, incidentally, was a Marine at the
Chosin Reservoir during the Korean War).
You may be thinking of, "Software Tools", which was
in ratfor ("Rational FORTRAN" --- a preprocessor that
took a semi-structured language and emitted FORTRAN)
and later translated into Pascal.
You are correct. I got them mixed up.
I used to read "Programming Pearls" in the back issues of the Journal of the ACM back in college and had picked up the book. Sadly, I let it go
a long time ago.
I discovered "Software Tools" shortly after that (which is probably why
I got them mixed) and read that one cover to cover. I think I even did some of the code in Pascal (before "Software Tools in Pascal" came out)
as an exercise.
That was by Brian
Kernighan and PJ Plauger. Incidentally, that caused
Kernighan to write his "Why Pascal is Not My Favorite
Programming Language" paper, which is worth a read: https://lysator.liu.se/c/bwk-on-pascal.html
I vaguely remember reading that a long time ago. PJ is right on
with his criticism of Pascal.
But as he notes at the beginning
"Comparing C and Pascal is rather like comparing a Learjet to a Piper
Cub - one is meant for getting something done while the other is meant
for learning - so such comparisons tend to be somewhat farfetched."
My college profs were always clear: We are doing stuff in Pascal to
teach you good habits. You won't use Pascal in the "real world", but
the good habits you pick up doing everything in Pascal will serve you well. And they were (pretty much) right.
Blue White wrote to Arelor <=-
That business model is at least part of what led to the dot.com crash
in the late 1990s.
Blue White wrote to Arelor <=-
That business model is at least part of what led to the dot.com crash in the late 1990s.
Those were heady times. I loved reading fuckedcompany.com and seeing
companies with horrendously non-viable business ideas, like flake.com,
a social network for people who like breakfast cereal.
A company had their comeout party in San Francisco and closed
operations the next day.
That makes sense, the PASCAL class was used mostly to teach data structures and algorithms, then everything else was in C, assembler or LISP. In retrospect I'd rather just jump into C or C++ instead of spending time learning another language.
Honestly, at this point, I can't think of a good reason
to teach C at the collegiate level.
tenser wrote to Dr. What <=-
I used to read "Programming Pearls" in the back issues of the Journal of the ACM back in college and had picked up the book. Sadly, I let it go
a long time ago.
I think you meant Communications of the ACM; JACM is mostly
theory. :-)
Bwk, but yeah. One of the problems was that they were
working in the context of standard Pascal, so they didn't
have some of the niceties that, say, Turbo Pascal brought
to the language (like a string type). It would have been
interesting to see a version of Software Tools in e.g.
Oberon.
Re: Re: RIP Niklaus Wirth
By: tenser to poindexter FORTRAN on Mon Jan 08 2024 03:51 am
That makes sense, the PASCAL class was used mostly to teach data structures and algorithms, then everything else was in C, assembler or LISP. In retrospect I'd rather just jump into C or C++ instead of spending time learning another language.
Honestly, at this point, I can't think of a good reason
to teach C at the collegiate level.
C++ (not C) appears to be the collegiate programming language of choice these days.
It was Java for a while, C before that, Pascal before that,
and FORTRAN before that.
tenser wrote to Dr. What <=-
I used to read "Programming Pearls" in the back issues of the Journal of the ACM back in college and had picked up the book. Sadly, I let it go a long time ago.
I think you meant Communications of the ACM; JACM is mostly
theory. :-)
I'm not sure. By the time I actually joined the ACM, I got Communications. But I'm pretty sure that the back issues I read were called "Journals of the ACM". But I'm uncertain which had "Programming Pearls", but I think it was Communications.
One of the things that surprises me as I get into vintage computers is
how much I mis-remember.
Bwk, but yeah. One of the problems was that they were
working in the context of standard Pascal, so they didn't
have some of the nice-ities that say Turbo Pascal brought
to the language (like a string type). It would have been
interesting to see a version of Software Tools in e.g.
Oberon.
Ya, Software Tools was mainly about writing unix-c-like stuff in various languages. But C was becoming the dominant tool by then and Software Tools in OtherLanguage wasn't that much in demand.
C++ (not C) appears to be the collegiate programming language of choice these days. It was Java for a while, C before that, Pascal before that, and FORTRAN before that.
boggles the mind. Hoping people move to Rust. C++ is a disaster, and I
can safely say that as someone extremely proficient in the language (up
to C++20)
C++ (not C) appears to be the collegiate programming language of choice these days.
Every time I see students come out of these courses, or the courses themselves, it's really "C with a C++ compiler and we used a stream".
boggles the mind. Hoping people move to Rust. C++ is a disaster, and I can safely say that as someone extremely proficient in the language (up to C++20)
rust isn't the answer. it just allows you to write crap code that doesn't crash. some might think that's a good thing, but it's just another in a long line of bloated junk.
I live near Santa Cruz, and drive by the old Borland building
occasionally, Seagate's old office (the address was 1 disk drive...) and where I took a UNIX training class at SCO.
tenser wrote to Dr. What <=-
the rest of the research community). So Communications
used to have a lot of papers that were kind of systems-y,
but much less these days. Communications nowadays is
more like a magazine.
Software tools took on a life of its own outside of the
books, and was a thriving project for quite a while,
particularly in the minicomputer era. E.g., it was quite
popular on Pr1me computers.
These days, of course, there's a C compiler for everything;
back then there was a Fortran compiler for everything, and
then a Pascal compiler for everything since that was the
language of teaching for so long.
Oberon was the last of
Wirth's languages, and in many respects, it was closer to
C than to Pascal; as such, it remedied many of the
shortcomings that Kernighan noted in his polemic about Pascal
(for example, in Oberon, the size of an array is not part
of its type, like in C). Had there been an Oberon version
of the book, it may have been a more natural presentation,
like a C version, for many of the utilities. Of course,
Oberon didn't exist at the time.
tenser wrote to Digital Man <=-
Yeah. My sense observing those classes was that
Pascal was used in the lower year classes, then C
for things like compilers, OS, etc. At one point
I saw a COBOL class offered. *shudder*
rust isn't the answer. it just allows you to write crap code that doesn't crash. some might think that's a good thing, but it's just another in a long line of bloated junk.
I've heard claims like this before, but I haven't experienced it myself.
I came to Rust very skeptical, but figured if it could deliver on even a quarter of its claims I'd be way ahead of C; it's pleasantly surprised me, and I've been using it professionally for about 5 years now. One _can_
I've heard claims like this before, but I haven't experienced it myself. I came to Rust very skeptical, but figured if it could deliver on even a quarter of its claims I'd be way ahead of C; it's pleasantly surprised me, and I've been using it professionally for about 5 years now. One _can_
What are you using it for? I've heard people like Rust, but none of the companies I've worked at have used Rust at all.
Yes. A lot of programmers seem to _really_ love complexity,
and some I'm sad to say view their ability to handle complexity
as a sign of superiority over those around them who, perhaps,
can't keep quite as much in their heads at one time. It's not
great.
I remember when they "Numerical Recipes in FORTRAN" book became
"Numerical Recipes in C". C seemed like the wrong language for the
higher-level stuff that FORTRAN did well.
Office Politics 101. Now, that would be a valuable class!
As vaulable as Data Structures, FORTRAN and C, that's for sure.
Adept wrote to Dr. What <=-
Teaching social skills to a nerdy crowd is probably hard work, too, so good teachers there would be _invaluable_.
But the second time through? It was fun! Just kinda neat to see how the various things fit together.
Yup. Plus having teachers who have spent some time in the corporate
world would have been very helpful. I don't think I've ever had a
teacher that hadn't spent his whole career in academia.
My dad taught 8th grade science, so I was exposed to that at a very
early age. That laid the base for me when he bought home a TRS-80 Model
I for the summer.
Adept wrote to Dr. What <=-
That _does_ make sense, though that sort of thing is so hard --
teaching _is_ a skill, so it kind of becomes like training an astronaut
to drill, or teaching drillers to be an astronaut.
repair logic skills. And with those skills I started doing tech support
at least by 5th grade when I got called out of a class to fix a
computer.
And, while I pride myself in being able to explain technical things to less-technically-inclined people, I've never had the slightest clue on
how to get people to _think_ in that sort of fashion, even for people
who do well with logic outside of the computer realm.
They may not be the best teachers, but they do have more knowledge than
a teacher who has never had a "real job".
don't prepare you for that. And for geeky people, that preparation would have been very useful.
They were bored until I told them that they could program the computer
to do their math homework. They had so much fun that they didn't
realize that they worked harder to write that program than they would
have done just doing the homework.
And I found that you can't get people to think in certain ways. The
best you can do is explain things in many different ways, hoping that
one will stick.
Adept wrote to Dr. What <=-
It does seem like something that might be ideal as guest lecturers of
some sort.
But always kind of hard to say what would be best, at least without
lots of well-designed studies.
And who knows how one _tests_ for such things.
Neat! I do remember, in my Physics class, where we could put stuff onto the calculator as notes for whatever we were doing.
So I wrote a program for a particular set of problems, which, of
course, meant that I knew the formula _really_ well, rather than it
being remotely useful as a cheat sheet.
They may not be the best teachers, but they do have more knowledge than a teacher who has never had a "real job".
It does seem like something that might be ideal as guest lecturers of
some sort.
the prof gave us the specs. Every other class or so, he would change
the specs - just a bit - or add another small requirement (scope creep). At the end, he told us why he did it that way - to simulate what you
will have to deal with on the job.
I did the same thing for Chemistry class in high school. My classmates said "The teacher won't accept the printout". But he did - along with
the source code for the BASIC program I wrote. He then said I didn't
have to do the homework for this anymore because if I could teach a computer how to do it, I must have mastered the concept.
Adept wrote to Dr. What <=-
Though, in school, I remember switching IDE, language, code repository, and probably lots of other things with each different set of classes,
so by the time things became requirement creep it just seemed like,
"okay, someone else has another set of requirements for us to deal
with".
But, yeah, not _quite_ the same, since theoretically the professor gave enough information at the beginning to know the final outcome.
We handed in the printout along with the logic, but it just got a
question mark or something on it, so I think we just confused the professor. But someone used to humans doing logic things is going to
have a different idea of what's possible to brute force than someone
who thinks it through with a computer.
Closer to the real world than you might think. Especially if the tech lead of the project suffers from neophilia.
It's no wonder why their IT dept stayed in the 70's so long.
Kinda make me think that that college needed to take a page from
Dartmouth and their BASIC program. It got a very high percentage of people involved in computers - and not just the one going into computers.
Adept wrote to Dr. What <=-
Obviously, with K-Mart, it wasn't blockchain or AI that would've been
the issue. And we're mostly talking about business people. For tech people, honestly, while it's exhausting to be changing things all the time, it's also pretty exciting to be trying new things all the time.
The other problem is that something that already works is generally
going to outperform something that might be better, but will have
teething problems. Sticking in the 70s is fine for a lot of things.
I've experienced too much pain where we had a developer who decided New Tech X was cool and then built a critical production process around it. Never mind that no one else on the team was interested in New Tech X, didn't know New Tech X and New Tech X didn't do a much better job than the current tech.
Bob Worm wrote to Dr. What <=-
Oh, yes. A hundred times... and usually said dev then leaves the
company and nobody can maintain this completely bespoke thing they left behind with no documentation!
As one of my university lecturers used to say - there's a difference between being clever and being smart...
I've experienced too much pain where we had a developer who decided New Tech X was cool and then built a critical production process around it. Never mind that no one else on the team was interested in New Tech X, didn't know New Tech X and New Tech X didn't do a much better job than
the current tech.
problem. Many companies out there will support you - for an arm, a leg and your first born. But that's usually a good indication that you
should be moving on. And they were trying to do that, they just kept derailing themselves.
I used to read "Programming Pearls" in the back issues of the Journal of the ACM back in college and had picked up the book. Sadly, I let it go a long time ago.
I think you meant Communications of the ACM; JACM is mostly
theory. :-)
C++ (not C) appears to be the collegiate programming language of choice these days. It was Java for a while, C before that, Pascal before that, and FORTRAN before that.
the rest of the research community). So Communications
used to have a lot of papers that were kind of systems-y,
but much less these days. Communications nowadays is
more like a magazine.
Ya, I let my membership lapse because they were moving away from computer technology and getting into more social-issues-and-stuff.
Adept wrote to Dr. What <=-
Yeah, shiny new things. Though, also, it's generally more fun building something up yourself than figuring out what the heck the last twenty people were doing with this code that's supposedly self-commenting.
Adept wrote to tenser <=-
I get Communications of the ACM, and it doesn't seem to have
Programming Pearls, at this point in time.
Which means _nothing_ for this discussion, but it sounds like an interesting column.
Though Communications of the ACM is still interesting, though I'm not
sure I've found any of it to be useful, practically, at this point.
Or, heck, not sure how useful my ACM membership has been, but I like
being a part of it, regardless.
I've dealt with (and still deal with) some developers who have an
attitude of "I don't need to document it. It's self-evident." Ya, to you, who did the research and wrote it. But it's Greek to someone else who didn't.
For my magazines, I use a 3 issue test: When the renewal comes up, I look through the last 3 issues. If I can easily find at least 1 article that
I want to keep for a long time, I renew. If not, it's not worth my
money.
I used to think the same way. Even ignoring my 3 month test. But it reached a point that, for me, I really didn't want to be associated with it anymore.
it anymore. And I had to give up my acm.org email address that I had
for ages.
Adept wrote to Dr. What <=-
it anymore. And I had to give up my acm.org email address that I had
for ages.
And you actually _used_ it?
Or I'm worried that my personal domain is getting e-mails sent to spam folders. But I long ago went away from having e-mail connected to
things that other people control, because they tend not to have my interests at the front of their mind.
That's why I don't use my gmail account anymore. But setting up my own domain, email server, etc. was just too much work to be of value to me.
I did locate a service that seems to be trustworthy (so far). It's
pay, of course, which helps to push it to the more trustworthy category.
I used to read "Programming Pearls" in the back issues of the Journal of the ACM back in college and had picked up the book. Sadly, I let it go a long time ago.
I think you meant Communications of the ACM; JACM is mostly
theory. :-)
I get Communications of the ACM, and it doesn't seem to have Programming Pearls, at this point in time.
Which means _nothing_ for this discussion, but it sounds like an interesting column.
That's why I don't use my gmail account anymore. But setting up my own domain, email server, etc. was just too much work to be of value to me. I did locate a service that seems to be trustworthy (so far). It's pay, of course, which helps to push it to the more trustworthy category.
I liked the tagline about the Xerox Alto you used.
Why isn't Multimail available in the APP Files on this thing.
Sometimes I have to "Praise The Lord ANYHOW".
Ed Vance wrote to poindexter FORTRAN <=-
The tagline "Alphabetize the Alphabet" brought to mind this:
ZYXW VUTS RQ PONML KJIH GFED CBA
Those spaces between the backward Alphabet mean pause(s).
On a good day I can say the above in 3 point 5 seconds.
On a Prodigy(sp?) CD years ago was a C-- program and I
think some instructions for C-- too.
NOW C=64 BASIC and IBM DOS 2.11 BASIC I DUG.
Sysop:       digital man
Location:    Riverside County, California
Users:       1,063
Nodes:       17 (0 / 17)
Uptime:      10:39:14
Calls:       501,113
Calls today: 6
Files:       109,393
D/L today:   413 files (40,863K bytes)
Messages:    300,367