5 programming languages that are fading fast

Peter Wayner, freelance writer

Computer languages never die, but they do fade away. For every slick, new, buzzword-compliant programming language that streaks across the sky like a rocket, radiating promises of theoretical grace and endless efficiency, there’s also a once-vibrant syntax that’s drifting down to earth, no longer cool, and largely used to keep legacy code from crashing.

This isn’t bad. There may not be a single new COBOL project being started anywhere in the world, but there are several hundred job postings from companies that need COBOL programmers right now. If you learn the language, you can keep a dusty deck of punch cards running for the next few decades.

In the past, we’ve celebrated the popular newcomers, and now, in the interest of fairness, we’ll summarize some languages that are, well, not mentioned as often as they used to be. This doesn’t mean that they’re worthless or bad. Not at all. There will still be projects and jobs that use them, but by all indications there will be fewer. Working with these fading languages will be less creative and more curatorial.

Why did they lose their luster? Often, they're just a bit too complicated, and the newer languages let you accomplish more with less grief. Sometimes they do things that aren’t as necessary, such as keeping a PDP-8 running. Occasionally, the folks maintaining the code let it go and the entire stack starts to fail. There could be any number of reasons for a language's decline.

This list is purely my subjective observations, and it will undoubtedly be greeted with howls of protest from the folks who are still loyal to their favorite ways of expressing a loop. They may say that nothing works as well as x. They may say that the code they wrote in one of these languages is faster and more stable than anything in use today. They may be right. But I see the market shifting away from these languages.

Perl

Perl’s simple, efficient syntax was once loved by command-line hackers, who called the language the Swiss Army knife of their server farms. If something needed doing, it often took just a few cryptic keystrokes to get it done. Just tap, tap, and tap again, and some gnarly file is reformatted to be just what we needed.

Perl is losing ground to dedicated server-maintenance tools like Chef and Puppet for keeping our machines running. Instead of writing generic shell scripts to configure machines, administrators now describe the desired state in code that these tools are optimized to enforce, automating most of the system chores.

When it comes to simple scripting, Python has overtaken Perl because it’s easier to read. DevOps teams may not care about simplicity, but all of the biologists and social scientists love it, and they’re increasingly using Python to chew through their data. This vital community is sucking some of the energy from Perl.

The old-school hacker culture is disappearing, and it is taking some of the love of tight, cryptic code with it. Maybe it’s just because the last few generations have been raised on Java, the language that asks us to pack two paragraphs of description into a CamelCase method name. Maybe it’s just because RAM and disk space are so cheap that we can write our code with super-long variable names and prolix syntaxes. Perhaps it’s just because no one uses HP calculators and RPN any longer. Who knows?

But that doesn’t mean Perl is disappearing. Curtis Poe, a Perl developer, said, “Devs may not like the fact that 25-year-old Perl code still runs, but companies do.”

.NET

Dig around your desk. Underneath the papers and the dust is a box with a plastic slab covered in buttons. If you look closely, you’ll see that there are letters on the buttons, and the letters are arranged in the same pattern as the smartphone screen you use to send text messages. The slab is called a keyboard, and the box is called a desktop PC. Your grandparents used the box and the slab like a smartphone, even though neither would fit in a pocket. Life was tough in their day.

When the PC ruled the internet and Microsoft ruled the PC, the .NET programming world was a great way to build user-friendly applications for the platform. C# is arguably still better than Java, at least in arcane ways like its support for properties and LINQ. Visual Basic is still pretty easy to start using.

But none of that elegance or simplicity matters now that everyone uses smartphones. Windows is receding, and code for .NET is becoming less and less necessary. It’s simpler to just write a web app with a mobile skin that will work everywhere. Why bother creating something optimized for the dusty old PC? 

Flash and Flex

Adobe’s tools once produced the prettiest, slickest graphical presentations on the web. While the web designers were arguing over whether a headline deserved an <h1> or an <h2> tag, Flash developers were rendering the words in custom fonts and sending them dancing across the screen. The algorithms offered good anti-aliasing support, so it all looked great too. When video came to the internet, Flash and its language, ActionScript, were there first.

Alas, Flash crashed a bit too often. Maybe it wasn’t Adobe’s fault. Maybe the browsers didn’t handle the memory allocation perfectly. Maybe the OS vendors had it out for Adobe. But the more it crashed, the more users started hating Flash. Some even began writing browser extensions that would block all Flash code.

Now the browser can use HTML, JavaScript, and CSS to deliver almost everything that Flash and Flex could provide. The canvas object offers arbitrary drawing. The video tag can display moving pictures in a rectangle. The platform is cleaner, simpler, and even occasionally as elegant as some of the Flash presentations of yore.

Nicolas Demange, a web developer, said he’s always loved Flash, as well as its larger cousin, Flex, saying it was “the most efficient and powerful language for advanced UI on the web.” Sometimes, when he’s in the middle of a new project, he wonders if Flash, Flex, and ActionScript are still more powerful than HTML5.

But nostalgia won’t bring back the old way of coding. What went wrong? “Steve [Jobs] killed it with his anti-Flash-Player war,” Demange said. Jobs is gone, and now he’s taken Flash with him.

C 

It seems impossible to toss aside a language that forms the foundation of UNIX and printer drivers everywhere. The OS, after all, underpins smartphones, desktops, servers, and almost everything else. It’s hard to find a refrigerator, car, or microwave oven that doesn’t rely on C code at the base of its stack.

But the fact is that, aside from the maintenance of these aging operating systems, interest in C is fading. Most students learn Java, Python, and JavaScript as their first, second, and third languages. Sure, Harvard’s famous CS50 teaches C, but Harvard also clings to teaching ancient Greek and that upstart Latin, too. Aside from the nostalgic folks, fewer and fewer programmers even know how many asterisks to put next to the variable name to make the pointer thingies work right.
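For readers who never wrestled with those asterisks, here’s a minimal C sketch of my own (the variable names are invented for illustration) showing how each extra asterisk adds a level of indirection:

    #include <stdio.h>

    int main(void) {
        int value = 42;      /* a plain int: zero asterisks */
        int *ptr = &value;   /* a pointer to an int: one asterisk */
        int **pptr = &ptr;   /* a pointer to a pointer: two asterisks */

        *ptr = 43;           /* dereference once to change value */
        **pptr = 44;         /* dereference twice for the same effect */

        printf("%d %d %d\n", value, *ptr, **pptr);   /* prints: 44 44 44 */
        return 0;
    }

Get the count of asterisks wrong anywhere and the compiler complains, if you’re lucky; if you’re not, the program crashes at run time. That’s the learning curve the newer languages spare their users.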

Apple’s Objective-C was a bright spot for C because it was the native language for the iPhone, which drew in millions of developers. But in the end, Objective-C was, even when it was introduced, an attempt to put some rouge on the corpse. Now the gold rush in the App Store is over, web apps don’t need to ask, "Mother may I?" in Apple's inscrutable review process, and Apple has built Swift, a nice, up-to-date language for the modern iOS programmer.

Mike Hendrickson, the vice president of content strategy at O’Reilly Media, says that even though there’s a slow, general erosion of books on C, the language will still be useful for those hacking hardware. He said that interest in C “will likely continue as the Internet of Things continues to grow. Those languages perform well on the metal, so to speak.”

Sales of books in the O'Reilly line are a good proxy for interest in particular computer languages. (Source: O'Reilly Media)

The world will need a few brave souls to volunteer to keep the various UNIX kernels running long into the future, but the rest of us can think of C like Lenin’s tomb in Red Square.

Assembler

PUSH, POP, MOV, ADD. Assembler programmers weren’t shouting or ignoring a stuck caps-lock key. They didn’t have lower-case back then. 

“There actually used to be entire applications written in Assembler — even games,” said Richard Conto, a UNIX programmer. “These days, assembler is (rightly) limited to specialized tasks.”

If C, the bare-bones language that hackers called “portable assembler,” is fading away, then the real assembler is all but forgotten. It’s too bad. There’s no better way to understand why an algorithm is running slowly than to trace it at the assembler level. Are you swapping too many numbers in and out of the registers? Is your code twiddling its thumbs because of a cache miss? People writing in a high-level language will never know the answers to questions like that. And their code will never run as fast as the bit-banging, register-reusing wonder that was toggled in through the front panel.
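You don’t need an assembler listing to feel the cache miss, though. Here’s a small C sketch of my own (a textbook illustration, not from any particular codebase) that sums the same matrix twice; the row-by-row walk touches consecutive addresses and stays in cache, while the column-by-column walk jumps thousands of bytes per read and, on most hardware, runs several times slower:

    #include <stdio.h>
    #include <time.h>

    #define N 2048
    static double m[N][N];   /* 32 MB, far bigger than any CPU cache */

    /* Row-major walk: consecutive addresses, cache-friendly. */
    static double sum_rows(void) {
        double s = 0.0;
        for (int i = 0; i < N; i++)
            for (int j = 0; j < N; j++)
                s += m[i][j];
        return s;
    }

    /* Column-major walk: each read jumps N * 8 bytes ahead,
       so most of them miss the cache. */
    static double sum_cols(void) {
        double s = 0.0;
        for (int j = 0; j < N; j++)
            for (int i = 0; i < N; i++)
                s += m[i][j];
        return s;
    }

    int main(void) {
        clock_t t0 = clock();
        double a = sum_rows();
        clock_t t1 = clock();
        double b = sum_cols();
        clock_t t2 = clock();
        printf("rows: %.3f s   columns: %.3f s   (sums: %g, %g)\n",
               (double)(t1 - t0) / CLOCKS_PER_SEC,
               (double)(t2 - t1) / CLOCKS_PER_SEC, a, b);
        return 0;
    }

Exact timings vary with the machine and the compiler flags, but the gap between the two loops is the cache miss made visible, the kind of thing the assembler generation understood in their fingertips.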

Not that it matters. Machines waste more cycles than they use, and the world has computational power to spare. We can wait for interpreted languages to run because the “wait” can be measured in microseconds. And then there’s the question of safety. High-level languages and their compilers can catch more mistakes. Sure, the assembler-writing gods of the past never made those mistakes, but we would.

Please feel free to rant against (or wholeheartedly agree with) my opinions in the comments.

Image credit: Flickr
