Any of our IT folk still able to code in COBOL?

Oh does THAT bring back some painful memories!

When I was first introduced to C and UNIX (Version 6. Yes, that long ago. Predates any standard libraries for C.) I had been writing some Pascal code on Digital Equipment Corp's RSX-11. The non-PC explanation of the differences went as follows: "Pascal and RSX-11 are like a virtuous woman. You have to play by the rules and you still may not be able to do what you want. C on UNIX is like a loose woman. You can do anything you like, but if you catch some horrible disease, that's your problem."

Picking up C written by a FORTRAN programmer (or COBOL for that matter) can be a harrowing experience.

Conversely, picking up a C program written to do much of what C++ does but without the OO constructs (particularly late binding) can be a debugging nightmare as well. I spent 3 years working on a C system that had some 1.2 million LOC (actual executable statements, not just source file lines) and more than 1,000 libraries constructed as late-binding variations of what were essentially class instances. It started out as an elegant design but had been maintained for over a decade by large groups of programmers with varying levels of expertise. I can tell you some stories...

I, too, would have to be paid a lot for such a project.

John

I taught myself C using Borland Turbo C. In fact, I had taken a matrix-inversion and linear-equation-solving routine written in Fortran and converted it to C, and made it into a graphical pump-curve sizing program. It will still run under a DOS shell. I really want to make a Windows version, but the graphics make up 90% of the code, and I have zero knowledge of the Windows programming environment.
 
Borland had a nice system for C. (And Pascal. Turbo-Pascal.)

I'm largely self taught in C as well. I tell people it's like learning to drive in a formula 1 car. Most folks will crash and burn, but the ones who survive are probably pretty good...
 
I was taught COBOL in college (‘93-‘94). C was a math elective. Sigh.

Actually did use COBOL for a couple years after graduating. Never touched it since. Now, one of my clients has some side program a guy wrote in COBOL however long ago to merge some files together into a single file for feeding into another system. We need to rewrite it in our software. The guy sent us a bunch of the source code / copy books.

Like a trip down amnesia lane...


 
Although verbose, COBOL wasn't bad for what it was designed for - merging and filtering files to produce huge stacks of columnar reports with nice subtotals interspersed.

I programmed in Pascal a lot back in the 80s. It was a pretty decent language for PCs. It reminded me of a watered-down version of PL/1 (which I liked a lot).
 
Borland also had an Ada compiler. Did my master's project in Ada - wrote a self-serve menu ordering system (yeah, like McDonald's has now, only then it was "DON'T TOUCH THE SCREEN" instead of a touchscreen!). Shortly after, I wrote a bunch of Ada code using Oracle's Pro*Ada add-in package against an Oracle database for the Navy's air-launched missile QA system. "Would you like fries with your Harpoon?" :)
 

One of our guys wrote an iPhone app that would connect to the simulation system and do (simulated!) call for fire from the (simulated!) artillery. Need artillery support? There's an app for that...
 
I last programmed in COBOL circa 1981. No desire to do it again. Coding in COBOL is like describing a mathematical operation in English rather than simply writing an equation. In other words, yuck.
Over the decades, updated versions of the common languages kept appearing, and to some extent still do. For example: FORTRAN 66, FORTRAN 77, FORTRAN 90, and then we got (*painfully*) Object-Oriented FORTRAN (no, I'm not kidding). Pascal morphed into Modula. And the same held true for COBOL. Much like Bjarne improved C into C++, the new iteration of COBOL is ADD_1_TO_COBOL.

But in reality COBOL doesn't migrate easily to anything else. It was designed for, and is still superb at, its original use - munging mass quantities of structured data. It's trivial to use a relational database with it; you don't even realize the DB is there. I have no idea what the current systems look like from the user's POV, but assuming a web front-end, I also assume the back end of the web form validates the data (phone number format, etc.), then formats it into usable records that are placed into the database. Then the COBOL code does the rest of the processing. So the major problems are 1) capacity and speed of the database and 2) processing speed of the hardware. Everything eventually becomes assembly language, so migrating to Python is irrelevant. We're talking scale at this point. I'd almost recommend a NoSQL solution, but with highly structured data that really doesn't make much sense either.

First rule: add more memory. Second rule: add more disk space. Unfortunately, some older DBs can't handle more disk space. They're "hard-wired" to the physical size of the device and can't use more than one physical device for the database. Worked on one of those in the late '80s... I blew out the DB in less than a week, so we moved to another vendor. The original vendor solved the problem months later, but even so, went out of business or got bought by someone else.
 
Although verbose, COBOL wasn't bad for what it was designed for - merging and filtering files to produce huge stacks of columnar reports with nice subtotals interspersed.

I programmed in Pascal a lot back in the 80s. It was a pretty decent language for PCs. It reminded me of a watered-down version of PL/1 (which I liked a lot).
PL/I was intended to be all things to all people. It was huge but made up for it by being slow.
Pascal was never intended to be a production language but strictly as a teaching language.

This started out as a document titled "How to shoot yourself in the foot in C" and has been growing over the years... decades. BTW, there are over 250 documented programming languages at last count (about 3 years ago, when I was teaching Principles of Programming Languages).

https://www-users.cs.york.ac.uk/susan/joke/foot.htm

http://www.toodarkpark.org/computers/humor/shoot-self-in-foot.html
 
They taught me Fortran in engineering school in the 1960s, on an IBM 7094. Then I decided to write a program to calculate pi, and it got to a few hundred places on a five-minute student pass. Next I took an elective course in assembly language, and was able to calculate it to a few thousand places. I didn't check all the digits, but I still have pi memorized to 15 digits.
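For anyone curious what that looks like without a 7094: below is a rough C sketch of the classic integer-only "spigot" approach (modeled on the well-known obfuscated pi program, not my old student code), which prints around 800 digits, four at a time, using nothing but integer arithmetic.

Code:
#include <stdio.h>

#define TERMS 2800                  /* enough terms for about 800 digits */

int main(void)
{
    static int f[TERMS + 1];        /* zero-initialized remainders */
    int a = 10000;                  /* work in blocks of 4 decimal digits */
    int b, c = TERMS, d, e = 0, g;

    for (b = 0; b < TERMS; b++)
        f[b] = a / 5;               /* every mixed-radix "digit" starts as 2 */

    while (c > 0) {
        d = 0;
        for (b = c; b > 0; ) {
            d += f[b] * a;          /* carry the current block into term b */
            g = 2 * b - 1;          /* mixed-radix base for term b */
            f[b] = d % g;
            d /= g;
            b--;
            if (b > 0)
                d *= b;             /* weight the carry for the next term */
        }
        printf("%.4d", e + d / a);  /* emit 4 digits (plus held-over carry) */
        e = d % a;                  /* remainder carries into the next block */
        c -= 14;                    /* ~14 terms consumed per 4 digits */
    }
    putchar('\n');
    return 0;
}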
 
In our trip down memory lane, the most bizarre language I ever used was APL. Hieroglyphics, read right to left. Great for matrix inversions, but horrible to maintain.
 
Borland also had an Ada compiler. Did my master's project in Ada - wrote a self-serve menu ordering system (yeah, like McDonald's has now, only then it was "DON'T TOUCH THE SCREEN" instead of a touchscreen!). Shortly after, I wrote a bunch of Ada code using Oracle's Pro*Ada add-in package against an Oracle database for the Navy's air-launched missile QA system. "Would you like fries with your Harpoon?" :)
I was part of a team putting together an air defense system in Ada. My part was "hooking" targets and putting up display blocks on the screen. The screens were these enormous CRTs that weighed as much as a pickup and were about the size of a commercial photocopier. The cool part was writing an algorithm to do a two-dimensional binary search. The language itself wasn't too bad, just cumbersome.
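For anyone who hasn't bumped into one, here's a rough C sketch of one way to do a two-dimensional binary search (not the original Ada - the grid layout and names are made up): binary-search for the row that could contain the value, then binary-search within that row.

Code:
#include <stdio.h>

#define ROWS 4
#define COLS 5

/* Assumes each row is sorted and rows don't overlap
 * (last element of row i < first element of row i+1). */
static int search2d(const int grid[ROWS][COLS], int key, int *out_r, int *out_c)
{
    /* Pass 1: binary search for the candidate row. */
    int lo = 0, hi = ROWS - 1, row = -1;
    while (lo <= hi) {
        int mid = lo + (hi - lo) / 2;
        if (key < grid[mid][0])
            hi = mid - 1;
        else if (key > grid[mid][COLS - 1])
            lo = mid + 1;
        else { row = mid; break; }
    }
    if (row < 0)
        return 0;

    /* Pass 2: ordinary binary search within that row. */
    lo = 0; hi = COLS - 1;
    while (lo <= hi) {
        int mid = lo + (hi - lo) / 2;
        if (grid[row][mid] == key) { *out_r = row; *out_c = mid; return 1; }
        if (grid[row][mid] < key)
            lo = mid + 1;
        else
            hi = mid - 1;
    }
    return 0;
}

int main(void)
{
    const int grid[ROWS][COLS] = {
        { 1,  3,  5,  7,  9},
        {11, 13, 15, 17, 19},
        {21, 23, 25, 27, 29},
        {31, 33, 35, 37, 39},
    };
    int r, c;
    if (search2d(grid, 27, &r, &c))
        printf("found at row %d, col %d\n", r, c);
    return 0;
}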

Now in the last couple months I've been writing code in Ruby, Java, Groovy, JavaScript, bash and displaying via HTML, CSS and Vue.js. Oh for the days of one programming language at a time! :confused:
 
I never was a programmer "for real". Just never was asked to do anything with software... I was a hardware guy, going back as far as the Army's S/360s (truck mounted, mind you) when the rest of the world was installing 4331s.

I learned BASIC, did a few college COBOL courses while I was in the Army. That was all through correspondence courses -- write the program, send it in, get it graded, but as far as I know no single line of COBOL that I wrote ever actually ran on a computer anywhere. Then there was a little PASCAL on a borrowed Apple ][, with a side trip into S/360 assembler hand coded to binary and programmed in via front panel switches. Learned 8080/Z80 assembler... then 8031... then didn't do anything for years. I learned Perl, and hope to never see or touch it again. Then I taught myself C from a couple of books, because I just had trouble with PIC assembly code. That was the closest I came to getting paid to write software... well, firmware. Spent the next 15 years or so making a nice little side business out of PIC firmware written in C, but all for my own products. Since offloading that business, I've started toying around with Arduinos a little bit... but I'm still not a "real" programmer. I just hack away until it works. I'll probably need to learn Python before long, as my professional world revolves around Splunk.
 
Dumb question - why would this be limited to IT folks (per the thread title).

I am positive I could figure out COBOL very quickly after learning/using so many other languages. The hard part (as mentioned earlier) would be how to actually get at the code and edit it. I bet that sucks. I've been working with Linux again lately (command line yuck) and I can't stand it after using modern tools.

If you’re coding, you’re an honorary IT person. LOL.

It’s just a title.

I’ve met people who don’t work in IT who can code circles around 20-year professionals... and I’ve met official tech workers who had to bash their face in with a facepalm because non-pros wrote ten thousand lines of code where a formally trained engineer would have used a simple, proper mathematical sort of about 100 lines. LOL.

The harder part for someone such as yourself would be convincing someone to hire you to touch a system that pays hundreds of thousands of people without some assurance you actually know what you’re doing, right?

“I read COBOL in 24 Hours and stayed at a Holiday Inn last night! Lemme in there, coach!” Heheheh.

We’ve hired what I would call young “street coders”, the free agents of the software world... but usually based on something like a massive resume of projects and access to their personal GitHub stuff, where we can look over their work and see where they do well and where they’re missing critical computer science concepts. That tells us how much pair mentoring they might need from a senior developer who can help them learn those things, or just survive a regimented corporate release process and model that they likely don’t have when working on personal stuff.
 
Didn’t we go through this pre-2000 to fix the two-digit date problem?

In '98/'99, my then-girlfriend's dad made good money putting his Rolodex from a 1980s job to use. He would tease the geezers who still knew how to program COBOL out of retirement and put them on projects to make legacy code Y2K tolerant. He himself didn't know much about programming, but he was a good salesman and able to put the right people together. COBOL was a dead language even back then; surprising that there is still code out there.
 
ROTFL. If you think the Linux command line is yucky, you wouldn't last 5 minutes with COBOL. Even the newest, flashiest IDE isn't much faster than vim if you know what you're doing. Don't get me wrong, the new IDEs are nice, but I'd hardly notice if you took them away and I had to use vi. Kinda like going from a glass panel back to steam gauges.

Looks like VSCode has COBOL support now. LOL.

What doesn’t VSCode have support for, really? Ha. That crazy thing is a Swiss Army knife right now.

I’m all about old school Unix and vi(m) but even I fire up VSCode now for things. It’s just, nice.
 
Looks like VSCode has COBOL support now. LOL.

What doesn’t VSCode have support for, really? Ha. That crazy thing is a Swiss Army knife right now.

I’m all about old school Unix and vi(m) but even I fire up VSCode now for things. It’s just, nice.
I have been using it some to learn Python lately. It is a nice product.

You can also get it for Linux and, on a side note, get PowerShell for Linux too. MS is going all in on Linux for some reason.
 
They taught me Fortran in engineering school in the 1960s, on an IBM 7094. Then I decided to write a program to calculate pi, and it got to a few hundred places on a five-minute student pass. Next I took an elective course in assembly language, and was able to calculate it to a few thousand places. I didn't check all the digits, but I still have pi memorized to 15 digits.

I actually use pi in engineering calculations. And if I'm doing them by hand, accuracy probably isn't that important and I'm just dumb checking to see that a junior hasn't screwed up a decimal point. So for most of my purposes it's 3.
 
I have been using it some to learn Python lately. It is a nice product.

You can also get it for Linux and, on a side note, get PowerShell for Linux too. MS is going all in on Linux for some reason.

Price and scale is why they’re all in. I suspect they realize they’ve got the desktop market, and with stuff like Azure and distributed Active Directory a mobile laptop can be fully corporate-controlled, plus their new online machine reimagining thing for those who’ll pay for E3 and higher... they just want to “embrace and extend” into custom Linux integrated completely with AD and PowerShell, so servers in Azure are running either their completely stripped Server 2019 that has NO GUI at all, or a Linux variant.

They’re actually building a really decent back end in their cloud offering while Apple continues to ignore the corporate needs for command/control and audit abilities needed to even get certain customers.

Telling a fleet of Windows boxes to centrally log events and stuff like that is a mouse click with global AD. It’s a massive chore on Apple or “normal” Linux.

Be interesting to see where they go with it. Downside, it’s MSFT and definitely not cheap.
 
I'd rather work on COBOL than work on C that was written by a COBOL programmer who never learned what a pointer, or even a proper function, is. Holy crap, that's some bad code. I printed out a program once when my boss asked me to review it to see if we could reuse any of it. I seriously thought the print queue had screwed up because the same 100 lines of code were in the stack 30 times, but it turns out the programmer just kept copying and pasting instead of writing a function. In his defense, there were more comments than there was code. (No, that's not really a defense.)

But, you'd have to pay me a LOT, up front, to consider doing either.
I put in a lot of comments. Because I may have to go back in and fix what I did. Programmers sometimes do cruel things to each other, and not intentionally.
 
Borland had a nice system for C. (And Pascal. Turbo-Pascal.)

I'm largely self taught in C as well. I tell people it's like learning to drive in a formula 1 car. Most folks will crash and burn, but the ones who survive are probably pretty good...
I first used Borland, which was lovely, then had to switch to VC for the next job. The entire Borland setup was far superior. (Kind of like how OS/2 Warp made Windows look like crap.)
 
I put in a lot of comments. Because I may have to go back in and fix what I did. Programmers sometimes do cruel things to each other, and not intentionally.
I used to. Then I realized breaking up the code more logically is better than a comment. And it doesn’t lie to you like a comment does when someone changes the code and doesn’t change the comment to reflect the change. Code is the only thing that tells you what is happening. Comments are often lying to you and even more often cryptic and useless.
 
Never heard of it and I’m not a programmer but I’ll give it a shot. What’s the worst that can happen?
 
I used to. Then I realized breaking up the code more logically is better than a comment. And it doesn’t lie to you like a comment does when someone changes the code and doesn’t change the comment to reflect the change. Code is the only thing that tells you what is happening. Comments are often lying to you and even more often cryptic and useless.
I’ve worked with maintenance programmers who’d run comment strippers on source so they wouldn’t be misled.
 
I used to. Then I realized breaking up the code more logically is better than a comment. And it doesn’t lie to you like a comment does when someone changes the code and doesn’t change the comment to reflect the change. Code is the only thing that tells you what is happening. Comments are often lying to you and even more often cryptic and useless.
Ding ding. If your code needs comments then the code is unclear and needs to be refactored. This is normal. The difference between the amateur and the pro is how often, how quickly, and with what strategy they refactor. By the time I submit a patch of code for review I’ve probably refactored it half a dozen times. The amateur wouldn’t dare refactor their code for fear of it breaking and not understanding how it really even works.

There are some situations where comments are justified, but they are very, very few, and they should describe why something is being done, not what is being done. For example: “// this operation is tried twice because the API is unreliable, see Vendor XYZ ticket #21740.”
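To make the "why, not what" point concrete, here's a tiny made-up C fragment (the vendor and ticket number are placeholders, not a real API):

Code:
#include <stdio.h>

int main(void)
{
    /* Bad: restates what the code already says. */
    int retries = 2;   /* set retries to 2 */

    /* Better: records the why, which the code can't express.
       (Vendor name and ticket number are placeholders.) */
    retries = 2;       /* first call to the vendor API fails intermittently;
                          see Vendor XYZ ticket #21740 */

    printf("retries = %d\n", retries);
    return 0;
}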

A professional engineer on my teams likely only adds about one comment to the code base per quarter.

If you can’t figure out what the code is doing without the comment then it’d never go to production on any of my teams... everything is code reviewed and must meet published standards... I even have a review process that is done on the reviews themselves :)
 
One of the other sites reminds you that if you can answer the call for COBOL programmers you probably need to make sure you've scheduled a colonoscopy.


IDENTIFICATION DIVISION.
PROGRAM-ID. RONS-PROGRAM. yada yada.

I left working for Martin Marietta right after they found out that not only did I have a security clearance and know COBOL, I knew the specific dialect (ACOB on a UNIVAC 1100) that they needed for a real boring DMA project coming up. I jumped ship to go work on a supercomputer UNIX port for the Army.

PL/I is an odd beast. A more modern structured language that maintained all the COBOL I/O formatting rules.

Twice in my career (once at the Army and once as a university admin for Rutgers), it was my job to get rid of all of the card processing equipment.
 
Ahh, comments.

I tend to get into a lot of trigonometry in the s/w I write, so I like to document the whole algorithm I’m trying to accomplish since it can be pretty complex. Write out the algorithm with bullet points, then follow that up with the code itself and the corresponding bullet points in the comments next to each section. I don’t believe in “self documenting” code, I’ve been around too long.
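Something like this, as a made-up illustration of the style (a plain haversine distance calculation, not code from any of those systems):

Code:
/* Great-circle distance between two lat/lon points (haversine).
 * Algorithm:
 *   1. Convert latitudes/longitudes from degrees to radians.
 *   2. a = sin^2(dLat/2) + cos(lat1)*cos(lat2)*sin^2(dLon/2)
 *   3. c = 2 * atan2(sqrt(a), sqrt(1-a))
 *   4. distance = R * c, with R the Earth's mean radius.
 */
#include <math.h>
#include <stdio.h>

#define PI          3.14159265358979323846
#define DEG2RAD(d)  ((d) * PI / 180.0)
#define EARTH_R_NM  3440.065        /* mean Earth radius, nautical miles */

static double haversine_nm(double lat1, double lon1, double lat2, double lon2)
{
    /* 1. degrees -> radians */
    double p1 = DEG2RAD(lat1), p2 = DEG2RAD(lat2);
    double dp = DEG2RAD(lat2 - lat1);
    double dl = DEG2RAD(lon2 - lon1);

    /* 2. haversine of the central angle */
    double a = sin(dp / 2) * sin(dp / 2) +
               cos(p1) * cos(p2) * sin(dl / 2) * sin(dl / 2);

    /* 3. central angle */
    double c = 2.0 * atan2(sqrt(a), sqrt(1.0 - a));

    /* 4. arc length */
    return EARTH_R_NM * c;
}

int main(void)
{
    /* Oshkosh (KOSH) to New York (KJFK), roughly */
    printf("%.0f nm\n", haversine_nm(43.98, -88.56, 40.64, -73.78));
    return 0;
}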

Two of my favorite comments I’ve seen over the years were on legacy systems I had to maintain:

One was in C, and the only comment in the entire control system was:

h = x; // hypotenuse

The other was written in Forth and had a section commented out that said, “Harry’s wife had a baby”. Apparently the story is that Harry got distracted and this section of the program never did work correctly so the next guy commented it out and re-wrote it. It was also the only comment in the whole program.
 
I comment code in places where the code may be perfectly obvious, but the reason for it isn’t. When you’re writing real-time stuff for a microcontroller, there are often many ways to do things... and when you discover that one works much better than the others, it pays to remind future you why you did it this way and not the other. I was reminded of that a few months back when I had to go fix some minor bug in a program I hadn’t touched in a few years. “Why on earth would I have done th... oh, yeah, that’s right. Now I remember.”
 
Ahh, comments.

I tend to get into a lot of trigonometry in the s/w I write, so I like to document the whole algorithm I’m trying to accomplish since it can be pretty complex. Write out the algorithm with bullet points, then follow that up with the code itself and the corresponding bullet points in the comments next to each section. I don’t believe in “self documenting” code, I’ve been around too long.

Two of my favorite comments I’ve seen over the years were on legacy systems I had to maintain:

One was in C, and the only comment in the entire control system was:

h = x; // hypotenuse

The other was written in Forth and had a section commented out that said, “Harry’s wife had a baby”. Apparently the story is that Harry got distracted and this section of the program never did work correctly so the next guy commented it out and re-wrote it. It was also the only comment in the whole program.

I loved Forth!
I wrote an entire quality control testing system in Forth.
Sadly, we "sold" the entire system to IBM Burlington in 1984, and I never wrote another line of Forth.
 
Favorite comment, one I found deep in a research airplane's control law [Fortran] code:
Code:
c this didnt used to be here

With an empty line below.

Nauga,
who didn't used to be there now
 
I comment code in places where the code may be perfectly obvious, but the reason for it isn’t. When you’re writing real-time stuff for a microcontroller, there are often many ways to do things... and when you discover that one works much better than the others, it pays to remind future you why you did it this way and not the other. I was reminded of that a few months back when I had to go fix some minor bug in a program I hadn’t touched in a few years. “Why on earth would I have done th... oh, yeah, that’s right. Now I remember.”
Yeah, there are some strange things I, and others, have had to do because of h/w idiosyncrasies. There are some chips that initialize in strange modes, or internal registers that have certain behaviors. Multi-channel A/D chips that have make-before-break issues when changing input channels, ... I don't do "IT" work, it's very hardware dependent and there are a lot of details that need to be documented. It's more robust to put that info inside the s/w to explain why certain things were done so that 10+ years later, when a new chip or other piece of h/w is used and something doesn't work anymore it's easier to debug.
 
I date back to paper tape and 110 baud acoustic couplers but never had the pleasure of Cobol. My first project out of school was a Fortran interface for a 9-track tape reader and a punch-card reader so we could convert all our punch decks to mag tape. My career as a programmer lasted 9 months, with 3 of those waiting for my check in the box so I could transfer.

Nauga,
onward and upward
 
Yeah, there are some strange things I, and others, have had to do because of h/w idiosyncrasies. There are some chips that initialize in strange modes, or internal registers that have certain behaviors. Multi-channel A/D chips that have make-before-break issues when changing input channels, ... I don't do "IT" work, it's very hardware dependent and there are a lot of details that need to be documented. It's more robust to put that info inside the s/w to explain why certain things were done so that 10+ years later, when a new chip or other piece of h/w is used and something doesn't work anymore it's easier to debug.
Reminds me of a time about 25 years ago, when my boss had me (an EE) work with our contract programmer to solve a problem with a system that was attempting to use a trackball to control the position of a stage under a microscope. The motion was jerky and unpredictable, which made the system totally unusable. I took a look at the specs on the motion-controller chip and saw that it had different modes. His software was using a mode that had feedback, which theoretically should have given the best precision in positioning the stage. I came to the conclusion that the electronic hardware didn't have the performance to keep up with real-time movement of the trackball. The chip also had a more primitive mode, which just used the raw data from the trackball without any fancy processing, so I suggested that he try that, and it worked. This method basically relied on the user to provide the positional feedback instead of the chip. It wasn't perfect, but at least it was usable. (Whether he commented the code to explain the reason for doing it that way, I don't know.)
 
My intro course to programming was in FORTRAN as well. How did we input things into the system?

[image: a deck of punch cards]


Woe to him who dropped his card deck.

We'd put them in the card reader, then have to wait some number of minutes for the printer to spit out our program and its results on this stuff:

[image: fanfold printer paper]


Usually what it told you was that you'd made a syntax error.

That's what columns 73 through 80 were for: numbering the cards, so that if you dropped them a card sorter could fix the problem quite easily. But punch cards suggest the use of a mainframe. I started in 1969 as a senior in high school, programming on an IBM 360/67 using the WATFOR compiler.

When I was a kid, I got a tour of one of those facilities in grade school, and card readers were already on the way out. I learned Fortran on a VAX/VMS system.

The last machine I used to write and update FORTRAN jobs on was a VAX 11/780. I really liked the VMS operating system. It was well behaved and easy to use. Much better than the NOS operating system on a Cyber 176, especially as administered by Martin Marietta Data Systems. Talk about user hostile. I don't miss them at all (and I left in October 1983).

I tried to get SPICE up and running on a Tandem system, but found out that the compiler was pure FORTRAN 77, with no extensions. Did you know that "Real*8" is an extension and not part of the standard language? Our compiler people (I worked for Tandem at the time) would not add anything to the compiler, and I wasn't about to rewrite 5000 lines of code to get around the need for that extension to the language. Hence SPICE never ran on a Tandem machine. That was back in the early/mid 1980s. I haven't touched FORTRAN since.
 
Several decades ago, a big NJ insurance company needed lots of COBOL programmers. They put together a training program that could transform someone with zero programming or technical experience into a “professional” COBOL programmer in 30 days. Secretaries, accountants, used car salesmen, bartenders - all became programmers.

It’s not rocket science...
 
I think our database is either ancient FORTRAN or COBOL mainframes, with an '80s text interface that has a “modern” HTML website as lipstick.

I’m actually the only one in the office who knows how to use the '80s text interface. The hilarious part is that the interface is accessed via an IE webpage. Half the time the text interface is easier to use than the asininely designed HTML website.

As an example, I needed to compile a list of all the crop dusters in the state last year. My buddy starts to show me how to run a report on the HTML website, which requires several complicated steps, including exporting it to PDF. I stopped him, logged into the actual mainframe, did a few keystrokes, and out popped a “PDF” printout of all the crop duster operators. He just looked at me, confounded. I guess I can blame my father for raising me in the engineering department of a TV station.
 