Any of our IT folk still able to code in COBOL?

Ok... but how many of youse guys have ever used 96-column cards?

I spent a lot of time in the 80s working on some of IBM’s more weird and wonderful hardware. S/3, Series/1, and some of their comm controllers like the 3705. What a time to be in the business.
 
In '87 I worked in the same office as someone who discovered it was possible to crash a Cray with FORTRAN code. The sysadmins insisted it wasn't possible, but my recollection is that the bounds of an implied loop within a print statement were not correctly checked by the compiler or system. Oops.
 
Several decades ago, a big NJ insurance company needed lots of COBOL programmers. They put together a training program that could transform someone with zero programming or technical experience into a “professional” COBOL programmer in 30 days. Secretaries, accountants, used car salesmen, bartenders - all became programmers.

It’s not rocket science...
No, rocket science is using JOVIAL and 2005 Ada, not even 2012 Ada.
 
Ok... but how many of youse guys have ever used 96-column cards?

I spent a lot of time in the 80s working on some of IBM’s more weird and wonderful hardware. S/3, Series/1, and some of their comm controllers like the 3705. What a time to be in the business.
Two sections of round holes. Univac, IIRC.
 
Ok... but how many of youse guys have ever used 96-column cards?

I spent a lot of time in the 80s working on some of IBM’s more weird and wonderful hardware. S/3, Series/1, and some of their comm controllers like the 3705. What a time to be in the business.
Oh me! Me!

I worked as a night operator for a data processing spin-off from an insurance company. We had an IBM System/3 Model 10. I learned RPG there as well.
 
One was in C, and the only comment in the entire control system was:

h = x; // hypotenuse

At my first job out of college I inherited a major database project written in FORTRAN and MACRO-11 on RSX-11M. (I had a research grant in college to do a FORTRAN/MACRO-11 database, so when I interviewed they jumped on that aspect of my resume.) The guy who wrote the original code was still around. One of the "software" engineering types ran some metrics software on the code, went up to the former author, and said, "Do you know that the software reports there's only one line of comments in the entire package?"

He said the software was clearly in error; there weren't any comments at all.

I investigated and found it had counted the ".TITLE" directive at the beginning of one of the files as a comment.
 
Ding ding. If your code needs comments then the code is unclear and needs to be refactored. This is normal. The difference between the amateur and the pro is how often, how quickly, and with what strategy they refactor. By the time I submit a patch of code for review I’ve probably refactored it half a dozen times. The amateur wouldn’t dare refactor their code for fear of breaking it, because they don't really understand how it works.

There are some situations where comments are justified, but they are very, very few, and they should describe why something is being done... not what is being done... for example “//this operation is tried twice because the api is unreliable, see Vendor XYZ ticket #21740.”

A professional engineer on my teams likely only adds about one comment to the code base per quarter.

If you can’t figure out what the code is doing without the comment then it’d never go to production on any of my teams...everything is code reviewed and must meet published standards...I even have a review process that is done on the reviews themselves :)
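
A minimal C sketch of the kind of "why" comment described above; the function names and the flaky-vendor scenario are invented for illustration, and only the ticket reference comes from the post:

#include <stdbool.h>
#include <stdio.h>
#include <stdlib.h>

/* Stand-in for a flaky third-party call (hypothetical). */
static bool vendor_submit(const char *payload)
{
    (void)payload;
    return rand() % 2 == 0;   /* fails roughly half the time */
}

static bool submit_order(const char *payload)
{
    /* This operation is tried twice because the api is unreliable,
       see Vendor XYZ ticket #21740.  The retry itself is obvious from
       the code, so only the reason for it gets a comment. */
    return vendor_submit(payload) || vendor_submit(payload);
}

int main(void)
{
    puts(submit_order("order-1") ? "submitted" : "failed twice");
    return 0;
}
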
I utterly disagree, having looked at way too much code from others. If there are ten ways to code something, two may come to mind, and the other eight may not be quickly recognizable. I'm talking about ANSI C here, btw, nothing newer.
 
I utterly disagree, having looked at way too much code from others. If there are ten ways to code something, two may come to mind, and the other eight may not be quickly recognizable. I'm talking about ANSI C here, btw, nothing newer.

A group of Sr Software types (I was one) were having a discussion about resume screening once. The question was: if a job applicant cited having won the Obfuscated C contest (https://www.ioccc.org/) (and assuming they did win), was that a plus or a minus? There was no clear consensus.

That level of coding shows a really good understanding of the language and its subtleties, but my, oh my, that's some ugly clever code.
 
Two sections of round holes. Univac, IIRC.

Yep, 90 columns in two banks. Even though I worked on UNIVACs I never saw a punch or reader for them. You'd occasionally see a punched card stuck in the paperwork for a machine. The systems I worked on all had standard-format cards (with some odd extra punches for things that existed on the UNIVAC and not on IBM).

IBM later came out with a smaller punched card, the 96-column card, which used a sort of BCD format in three banks. Never saw that punch either, though you'd see the cards in some early point-of-sale stuff.

I had this sitting on my desk for years. Very, very few people could identify it:

[attached photo: 49756170358_ceb852a7ca_o.jpg]
 
One of my specialties over the years was figuring out and debugging other people's code. One of my coworkers noticed that when I identified a mistake but hadn't the inclination to do anything about it, I'd just mark it with a comment: BOGUS. Randy realized that the number of Os in BOGUS was an indication of how bad the construct was. Ones like BOOOOOOOGUS were particularly bad.

I came across one that was apparently neatly coded and commented; the comment read:

// This takes the average of all the up angles and sets the mosaic to that.

Well, first, it's not clear why you'd want to do that (imagine you threw a bunch of playing cards on the table in random order and you're trying to decide how to orient a photo of them all). Second, the way you average angles together is not to add them up and divide by n: an image at rotation 10 degrees and one at 350 degrees should average out to 0, not 180. There was also a fencepost error in the code, so the last image wasn't even considered.
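
For what it's worth, here's a minimal ANSI-style C sketch of the usual fix: average the angles as unit vectors and take atan2 of the sums. The function name is mine, not from the original code:

#include <math.h>
#include <stdio.h>

/* Average a set of angles (in degrees) by summing their unit vectors
   and taking atan2 of the result, so 10 and 350 average to 0, not 180. */
static double mean_angle_deg(const double *deg, int n)
{
    const double pi = acos(-1.0);
    double sx = 0.0, sy = 0.0;
    int i;
    for (i = 0; i < n; i++) {              /* i < n, not n - 1: no fencepost */
        sx += cos(deg[i] * pi / 180.0);
        sy += sin(deg[i] * pi / 180.0);
    }
    return atan2(sy, sx) * 180.0 / pi;     /* result in (-180, 180] */
}

int main(void)
{
    double ups[] = { 10.0, 350.0 };
    printf("%.1f\n", mean_angle_deg(ups, 2));   /* ~0.0, not 180 */
    return 0;
}

(Compile with -lm on most systems.)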

All programmers make mistakes. My best programmers never made the same mistake twice after I pointed it out to them. Others never learned, and I eventually had to let them go.
 
A group of Sr Software types (I was one) were having a discussion about resume screening once. The question was: if a job applicant cited having won the Obfuscated C contest (https://www.ioccc.org/) (and assuming they did win), was that a plus or a minus? There was no clear consensus.

That level of coding shows a really good understanding of the language and its subtleties, but my, oh my, that's some ugly clever code.
Yeah, I did a local one of those once. But I didn't win; someone did something whose result was so far from obvious (even when stepping through the code) that I was seriously impressed. I'm just not that cruel; I write the simplest code that will do the job (same when I design electronic circuits: do the basics, add stuff only if needed).
We used to judge resumes based on use of language, BTW. If they had no concept of English we'd bin 'em.
 
Yeah, I did a local one of those once. But I didn't win; someone did something whose result was so far from obvious (even when stepping through the code) that I was seriously impressed. I'm just not that cruel; I write the simplest code that will do the job (same when I design electronic circuits: do the basics, add stuff only if needed).
We used to judge resumes based on use of language, BTW. If they had no concept of English we'd bin 'em.
I teach my students that the most elegant code is the simplest that satisfies the problem. I also teach them that the program is not the solution to the problem, but one implementation of the solution in that particular language. I insist on a complete and comprehensive design in English - remember, I’m teaching at this point.

Well written code is a joy to read.

Example: I make them write, in English, all the steps to do I/O. But when I’m doing design for $$$, my design will state something like “write it to filename” because I already know all the gory, detailed steps behind that phrase.

Practice, practice, practice.

Last summer one of my students walked in wearing a hoodie with the phrase I keep pounding into their heads... “Think first, Code last”. I got thru to one of them!
 
I teach my students that the most elegant code is the simplest that satisfies the problem. I also teach them that the program is not the solution to the problem, but one implementation of the solution in that particular language. I insist on a complete and comprehensive design in English - remember, I’m teaching at this point.

Well written code is a joy to read.

Example: I make them write, in English, all the steps to do I/O. But when I’m doing design for $$$, my design will state something like “write it to filename” because I already know all the gory, detailed steps behind that phrase.

Practice, practice, practice.

Last summer one of my students walked in wearing a hoodie with the phrase I keep pounding into their heads... “Think first, Code last”. I got thru to one of them!
I often use a technique where I write the design out in comments and then delete the comments as I implement it; the variable, function, and class names, etc., replace the comments with actual details. It leads to very readable code.
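
A tiny, invented C example of that workflow (the file name and helper names are made up). The design starts life as comments in an otherwise empty function:

    /* open the file                                  */
    /* read each "key = value" line                   */
    /* store each setting and count it                */
    /* close the file and report how many were loaded */

and by the time the code is written, descriptive names have replaced them and the comments are gone:

#include <stdio.h>

/* Hypothetical helpers, stubbed so the sketch is self-contained. */
static int is_key_value_line(const char *line)
{
    return line[0] != '\0' && line[0] != '#' && line[0] != '\n';
}

static void store_setting(const char *line)
{
    (void)line;   /* a real version would parse and save the pair */
}

static int load_config(const char *path)
{
    FILE *f = fopen(path, "r");
    char line[256];
    int loaded = 0;

    if (f == NULL)
        return -1;
    while (fgets(line, sizeof line, f) != NULL) {
        if (is_key_value_line(line)) {
            store_setting(line);
            loaded++;
        }
    }
    fclose(f);
    return loaded;
}

int main(void)
{
    printf("loaded %d settings\n", load_config("app.conf"));   /* made-up file name */
    return 0;
}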
 
Two sections of round holes. Univac, IIRC.
IBM used the smaller 96-column cards in their System/3. I worked on quite a few S/3s, but I can only recall one customer that actually had punched-card equipment for theirs. There was a magazine publisher in Cleveland that had the largest System/3 I ever encountered, and they used cards for a lot of stuff. Most of the others I worked on were much smaller, and I don't recall any that had PCM.
 
I teach my students that the most elegant code is the simplest that satisfies the problem. I also teach them that the program is not the solution to the problem, but one implementation of the solution in that particular language. I insist on a complete and comprehensive design in english - remember, I’m teaching at this point.

Well written code is a joy to read.

Example: I make them write, in english, all the steps to do I/O. But when I’m doing design for $$$, my design will state something like “write it to filename” because I already know, all the gory, detailed step needed in that phrase.

Practice, practice, practice.

Lat summer one of my students walked in wearing a hoodie with the phrase I keep pounding into their heads... “Think first, Code last”. I got thru to one of them!

I've kept this cartoon in my files for decades. I pull it out when needed.
[attached cartoon: Screen Shot 2020-04-10 at 10.05.35 AM.png]
 
I often use a technique where I write the design out in comments and then delete the comments as I implement it; the variable, function, and class names, etc., replace the comments with actual details. It leads to very readable code.
I did similar, until one day I had to debug some old old code - and who better than the old old dog in the company? :)
As I dove into it, my first thought was "what the hell is this and who the hell wrote this?"
then it started to look familiar ...
and then it dawned on me that "I was 'that guy'" ... it was mine.
I started leaving the header comments in place, as well as some inline comments on those obtuse sections ...
 
I did similar, until one day I had to debug some old old code - and who better than the old old dog in the company? :)
As I dove into it, my first thought was "what the hell is this and who the hell wrote this?"
then it started to look familiar ...
and then it dawned on me that "I was 'that guy'" ... it was mine.
I started leaving the header comments in place, as well as some inline comments on those obtuse sections ...

LOL I've had knock-down fights with other programmers who want me to explain some bit of code that I swear I had nothing to do with, only to find some incriminating evidence later that I was the perpetrator.

I curse my past self often, but even more often I find he was smarter than I am. And of course, I put comments in to explain something that doesn't seem logical at first glance. But I don't explain things that don't need to be explained. I'll refactor before commenting whenever possible.
 
I did similar, until one day I had to debug some old old code - and who better than the old old dog in the company? :)
As I dove into it, my first thought was "what the hell is this and who the hell wrote this?"
then it started to look familiar ...
and then it dawned on me that "I was 'that guy'" ... it was mine.
I started leaving the header comments in place, as well as some inline comments on those obtuse sections ...

I often use a technique where I write the design out in comments and then delete the comments as I implement it; the variable, function, and class names, etc., replace the comments with actual details. It leads to very readable code.
A really good design IS the comments (there's something wrong with this syntax - doesn't sound correct but it is)
 
Back when I worked for Tandem, the software that ran our Open Area Test Site (OATS) and RF semi-anechoic chamber was written in HP Basic by my boss at the time, who had his PhD in radio astronomy from Stanford. HP calling that language Basic was an insult to the language. Yes, you could limit yourself to Basic commands, but it allowed far more than that. I'm not sure what the limit was on the length of a variable name, but it was long enough that fully descriptive variable names worked great. Subroutines with true independent variables. IF/THEN/ELSE structures. Etc. With REM lines (comments) I maintained and updated this program for 10 years. I even came back to visit after being gone from the company for a year and was able to fix a problem in the code in a matter of minutes.

At the other end of the spectrum, there was a digital/analog hybrid computer in the EE department at Washington State University when I was in college that I don't recall ever seeing anyone use as a hybrid. I took a one-semester class on how to program the analog half of it (haven't touched an analog computer since) and was one of a small handful (as an undergraduate, no less) who ever touched the digital half. Turnaround on the IBM 360/67 over in the computer center had to be horrendous, and you had to be desperate, before you'd ever touch it. Running a FORTRAN job took the following steps (with plenty of switch throwing on the console):

1. Load a paper tape (thank goodness it had an optical reader) that gave you an editor. Type in the program you wanted to run using the teletype terminal.
2. Once you were happy with the source code you had written, flip a few switches on the control console and punch out a paper tape of your program.
3. Load a compiler tape, followed by your source code tape, and after it was finished compiling your program it would dump out an object code tape. Assuming you didn't have any errors in your source code, of course.
4. Load a run time module tape, followed by your object code tape and the program would run. This assumes that you had no errors in your program that messed up the results.
5. Dump the output to the printer.

As I said, you had to be desperate to use it. It did have the advantage of always being available due to the mess involved in running it. :D

When it comes to computers, these (meaning now) are the "good old days".
 
LOL I've had knock-down fights with other programmers who want me to explain some bit of code that I swear I had nothing to do with, only to find some incriminating evidence later that I was the perpetrator.
'git blame' to make sure it wasn't me who screwed up.
 
'git blame' to make sure it wasn't me who screwed up.
I'm talking about legacy code that's been through 2 or 3 source control migrations over its lifetime. lol
 
I've kept this cartoon in my files for decades. I pull it out when needed.
View attachment 84551
I think every software shop has that pinned on the wall somewhere. (at least everywhere I've worked.)

And, yes, after 15 minutes I asked "This doesn't make any sense at all, who was the idiot that wrote this?", the answer being "You wrote it, that's why I'm asking you about it."
 
If the price was right, I would do it in a heartbeat... It is surprising how many systems use it.
 
'git blame' to make sure it wasn't me who screwed up.
At the first place I worked as a software engineer, two engineers were developing a Z-80 based embedded SBC along with the boot ROM and operating software for it. They had a convention that one wrote all their assembler code in lower case and the other wrote all their code in upper case. The cross assembler we used was not case sensitive. They did this so that when a bug was found they could tell who to blame.
 
At the first place I worked as a software engineer, two engineers were developing a Z-80 based embedded SBC along with the boot ROM and operating software for it. They had a convention that one wrote all their assembler code in lower case and the other wrote all their code in upper case. The cross assembler we used was not case sensitive. They did this so that when a bug was found they could tell who to blame.


I presume if you were the third guy you had to aLtErNaTe between cases to identify your code...?
 
Ah, the good old days. My first paying programming job was while I was still in college, writing Fortran IV. We had an IBM 360 in the computer center. It was faster than the 1130 in the keypunch room, but with the 1130 you didn't have to wait for hours to get your results.

I don't have experience in all the languages mentioned in this thread, but enough to get me by. Give me 3 weeks to get up to speed in COBOL, and I'd be good.
 
I did similar, until one day I had to debug some old old code - and who better than the old old dog in the company? :)
As I dove into it, my first thought was "what the hell is this and who the hell wrote this?"
then it started to look familiar ...
and then it dawned on me that "I was 'that guy'" ... it was mine.
I started leaving the header comments in place, as well as some inline comments on those obtuse sections ...
When I worked at Sun I was asked to look into a failing CPU module. It turned out to have bad cache. I wrote several utilities to expose the problem and gave them to the CPU group. Later, I asked them for help on another issue and what did they give me? My original utilities!
 
When I worked at Sun I was asked to look into a failing CPU module. It turned out to have bad cache. I wrote several utilities to expose the problem and gave them to the CPU group. Later, I asked them for help on another issue and what did they give me? My original utilities!
Many years ago the University of California, Berkeley released a version of UNIX with their own shell, csh, which had a neat feature that allowed you to move jobs back and forth between the foreground and background. Syntactically, it was an ugly language. I preferred the standard UNIX (Bourne) shell, but I liked the csh job control, so I figured out how it and the Bourne shell worked and hacked job control onto that shell. About the same time a guy named Korn at Bell Labs had done the same thing, but his shell hadn't been distributed.

Two things happened later on. First, googling my name used to always show who had the Linux docs online, because I had sat down with the pdksh guys at one point and explained how job control worked so they could put it in their shell, and they gave me credit. Second, the CMU Mach project picked up the shell unbeknownst to me, so it went into a lot of systems based on that, like the NeXT and the later Mac Unix. One day years later I was sitting at a MIPS workstation and typed one of the job control commands. "Job control not enabled," it said. Hey, that sounds like something I wrote. "set +J" I type. "Job control enabled." Holy crap, this is a Ron shell.

The other funny one was I wrote one of the first internet routers. Since I was actually an employee of the US Army at the time, the code was essentially in the public domain, so I sent out a lot of free copies of it to people. One site that was using it was the Space Telescope Receiving Lab at the Hopkins Campus.

I get a call one day.
STRL: We brought a VAX up on our network and now the gateway is printing errors.
ME: Really, what sort of errors?
STRL: It's coming from the Interlan driver. (Not surprising; that's the Ethernet interface.)
ME: What's it say? Is it printing out a status register? (most of the errors are usually just dumps of various device registers).
STRL: (getting really cagey now) Well, it's something about trailers.
ME: (thinking a bit, oh yes) Is it "Trailers make me barf?"
STRL: Yes, that's it.

I guess they didn't want to come right out and say it.
 
When I worked at Sun I was asked to look into a failing CPU module. It turned out to have bad cache. I wrote several utilities to expose the problem and gave them to the CPU group. Later, I asked them for help on another issue and what did they give me? My original utilities!

Didn’t know you worked there. I really loved Sun gear in telecom.

Freaking tanks that just churned out cash with solid OSes (both SunOS and Solaris).

Throw in a dash of Informix and HA cluster with Veritas, and it was a monster database hauler.

The Enterprise-branded stuff from just before the fall was superb hardware.

We had a massive water disaster at one company I worked at: maintenance being done many floors up created a scene that looked like a touchless car wash raining down into a small closet server room.

The servers still running afterward? The Enterprise 450s and 480s.

I have no earthly idea how.

We carefully powered them down and dried them out, and they went back into service, with complete replacements on the way that got rolled in over the next four days, two of those being ordering and shipping.

In my world, after Sun died, Linux replaced it overall. And we had 'em all: AIX, HP-UX, BSD, SCO... nothing ran like our Sun stuff.
 
Thinking of disaster data centers: I was contracting with DOJ in DC. They had an office where the data center was located in a basement connected to the building next door.
Over about six months, the bar above the data center had patrons set off the fire alarm and the sprinklers multiple times. The data center had about six inches of standing water in it the first time. All the equipment was fried and replaced. Made for an interesting data center.

Tim

 