5 Programming Languages You Won’t Use by 2030

Programming languages fade. It’s a fact of life: The old and tired is eventually replaced by the new and better (or at least slightly different). While it’s impossible to predict which languages will endure over the next decade (except Python and JavaScript; we’ll bet both of those are sticking around), it’s worth asking which languages are on a trajectory to oblivion. After all, you don’t want to burn time learning something that won’t have utility for much longer.

Perl Will Fade Away

Yes, Perl has its adherents. Every time an article rolls out that declares Perl on the endangered-languages list, fans point to the number of open Perl jobs currently listed, and how there’s still an active developer community around Perl and its potential evolution into something new and interesting.

Yet RedMonk and the TIOBE Index both show Perl in decline—and while you may take issue with how either site ranks programming languages, if their varying methodologies arrive at the same conclusion, then it’s safe to say that something is actually going on here. Adherents like to point to Perl’s solid performance and its ability to scale (hence its lovely nickname, “The Swiss Army Chainsaw”), but other languages—most notably Python—have begun to eclipse it with regard to adoption, libraries, and features.

We’re betting that, over the next ten years, the number of folks using Perl will decline still further, putting the language at serious risk of fading away entirely.

Why Will Objective-C Die?

Objective-C, the language Apple inherited from NeXT, has been alive for 36 years, and it seems certain that some legacy apps will still incorporate it 10 years from now. But the chances of you actually working with it by the end of the decade? Pretty much zero, unless you land a job maintaining some obscure iOS app that a company absolutely refuses to update under any circumstances.

Apple is dedicated to ensuring that Swift, which launched in 2014, totally replaces Objective-C. Despite that push, however, Objective-C has managed to maintain much of its popularity and market-share. Why is that? The explanation is pretty straightforward: Swift didn’t emerge fully featured, and its creators have spent the past five years adding crucial features such as ABI stability. That’s encouraged many iOS and macOS developers to stick with Objective-C for the time being.

As Swift becomes more robust, though, the use-case for Objective-C will fade. Over the next ten years, companies will also rewrite big chunks of their iOS/macOS codebases in Swift, lessening the need to maintain Objective-C legacy code. If you’re interested in building for Apple’s ecosystem, now’s the time to learn Swift’s core features, including its String and Array types and the Swift Package Manager.

Ruby Will Go Bye-Bye

Last year, our analysis of Dice job-posting data showed a significant dip in the number of companies looking for technologists skilled in Ruby. Coding Dojo stopped teaching Ruby on Rails, the language’s popular web framework. And Ruby has been falling on the various programming-language rankings, including IEEE Spectrum’s. To paraphrase The Simpsons, Ruby is probably in some serious long-term danger.

Why Is R Dead?

Data science is only becoming more crucial to many companies, leading to a spike in job postings for data scientists, data analysts, and data engineers. Core programming languages for data science include Python, R, and SQL—but there are a lot of signs that Python has begun to swallow up R, which was developed for statisticians and data analysts.

“Behind Python’s growth is a speedily-expanding community of data science professionals and hobbyists—and the tools and frameworks they use every day,” read the latest edition of GitHub’s State of the Octoverse. “These include the many core data science packages powered by Python that are both lowering the barriers to data science work and proving foundational to projects in academia and companies alike.”

For years, various sites have tracked Python gaining ground at R’s expense. In 2018, a KDnuggets poll traced a decline in R usage in favor of Python (especially among tech pros who utilized both languages). Meanwhile, a separate survey from Burtch Works revealed that Python use among analytics professionals grew from 53 percent to 69 percent over the same two-year period, while the R user base shrank by nearly a third.

There’s no reason to think this trend won’t continue over the next few years, putting R at serious risk of being completely supplanted by Python.

CoffeeScript Is Burned

According to codementor.io, CoffeeScript (a programming language that compiles to JavaScript; yes, we’re counting it) ranked among the worst languages for community engagement and growth in 2019. “CoffeeScript’s Facebook community was nonexistent this year and it ranked near the bottom of the pack for Freenode IRC, Twitter, GitHub, and Stack Overflow tags,” the website wrote in a note accompanying its data. “/r/coffeescript, unfortunately, also had the dubious honor of being the least popular subreddit on our list of languages.”

CoffeeScript has also experienced a decline in Google Trends. Barring some remarkable event that returns it to relevance, it seems unlikely that CoffeeScript will be broadly used within ten years.

53 Responses to “5 Programming Languages You Won’t Use by 2030”

  1. Johnny L. Hopkins

    The R programming language still has a strong user base and is one of the languages recommended for data science/statistics students to learn alongside Python. A good data scientist understands the business problem, analyzes the data for trends, and uses the tool most effective in communicating results to stakeholders. R will not go away because it is open source and can be used within Python, C/C++, and other programming languages.

    • Yup, he is so wrong on R. I’d like to think he’s wrong on Perl; I’m a very old-timer, and while I have learned to use Python, Perl is my Swiss Army knife for sure. I’m glad he was wise enough not to put Fortran on the list!

    • Michael Szczepaniak

      I read the KDnuggets article in 2018 re R as well. Back then I wanted that prediction to be right, to justify spending more time in Python. My current job has forced me to be pretty fluent in both (R and Python) now, and I expect Python to be more dominant in the coming years. That said, R is incredibly rich statistically and is actually easier to wrangle data with, IMHO, thanks to the tidyverse (e.g. dplyr) libraries. I think anyone serious about data science should have a solid command of both R and Python, because R is not going away as fast as some may think.

  2. george bright

    Never declare a language dead. Perhaps it sees less use, but with our legacy world of IT structures, nothing ever really dies. When I took my first job in the ’80s, both mainframes and COBOL were supposed to be leaving the IT world. Get to 2020 and both are still in use.

  3. Baked in Boise

    Good riddance on all but Ruby. Ruby’s OK, it’s just a little off the beaten path. Whatever happens, it’s good to keep up with the flavor of the day, as awful as it may be, so you can get the next job.

    • Ruby SHOULD die. “There’s more than one way to do the same thing” means that whoever later maintains the code has to know them all. A beginner can program it, but it takes an expert to know what he did.

      • While Ruby allows MANY ways to do it, its real problem is also its greatest feature: in Ruby, you can fundamentally alter the characteristics and meaning of the most basic language capabilities. So, after you execute a piece of Ruby code from ANYONE (such as in a gem), it can change the meaning of ALL Ruby code that is executed thereafter. The result is that ORDER MATTERS, and it is practically impossible to “engineer” a Ruby deployment for reliability. A good example is Chef (Opscode). I was part of their beta when they introduced their hosted solution. My coworker and I opened eighty (80) tickets against their “knife” tool and supporting infrastructure. Chef could not troubleshoot them because we (they and us) could not document our Ruby installations reliably enough to call them “configuration management.” Chef soon punted, deploying a thick-client version of “knife” that bundles everything Ruby needs, in a controlled manner, all the way down to “libc.” That is an extreme punt! Ruby is amazingly creative, but it is an engineering nightmare. Remember that if you give any consideration to Chef as well. One cannot build a good deployment tool upon a language that completely resists configuration management.
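The “open classes” behavior this comment describes can be shown in a few lines. This is a deliberately contrived sketch (the patched method and the output are illustrative, not taken from Chef or any real gem):

```ruby
# Ruby's core classes are "open": any loaded code may reopen one and
# redefine an existing method. The new definition then applies to ALL
# Ruby code executed afterwards in the same process.
class String
  # Keep a handle on the original implementation...
  alias_method :original_upcase, :upcase

  # ...then silently change what every later caller of #upcase gets.
  def upcase
    original_upcase + "!"
  end
end

puts "order matters".upcase  # => "ORDER MATTERS!"
```

Because a `require` anywhere in a process can do this, the behavior of any piece of code depends on what was loaded before it—the order-matters, configuration-management problem described above.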

  4. R is a child of FORTRAN…which has been around since the stone age of computers. Unless the paradigms of statistics and data analysis change completely in the next 10 years, there is no reason to believe that R will go away. It is a workhorse that is useful. The same was said of COBOL, which seems to still be alive and well, although with a much smaller reach, even after 60 years…and there the paradigm DID change.

    • The irony of COBOL’s recurring obituary is that, as young people are increasingly turned away from it, those of us with gray hair can still command some mighty lucrative salaries! There’s a lot of old iron that’s not going away any time soon, or so it seems.

      • Heck, COBOL is being taught at large shops that still use mainframes – but they also use modern DevOps tooling (Eclipse-based IDEs, Git, Jenkins, Groovy, etc.) as well. The appetite (and budgets) for rip-and-replace just isn’t there. Hollowing out the core is where time and money are being spent.

  5. Steve Cohen

    I assume these authors must think that Java has been dead for ten years, since they don’t so much as mention it, but it was still going strong when I retired less than two years ago.

    • Dhananjay Vemuri

      I don’t see Java, Python, R, or JavaScript expiring in the next few decades; as far as I can see, the open-source libraries around data science, AI, and machine learning are heavily supported, and thousands of jobs are begging to be filled in that space. However, I agree that Objective-C, Ruby, and Perl are redundant, and the open-source community volunteers just aren’t there. I think it’s good to get rid of them.

  6. Barrie Schaef

    Not Dice, but every boss I had declared COBOL dead and when I offered to make JCL job changes – they said it was going away so don’t admit I know it! Still there in banks and our government!

  7. Delphi has to be dead already; good job Inprise/Borland, thanks for killing it.
    COBOL is slowly going away; I’ve seen shops tell their developers to learn .NET.
    Maybe not by 2030, but by 2040 I think Angular will die off.
    The last thing we need is more languages that compile to JavaScript; please just go away.

    • Old Timer,

      I started in COBOL-68 back in 1970 and have worked in banks, retail, insurance, etc., and I’m still going strong after 51 years with COBOL, which is alive on the internet today. Millions of lines of readable code, entrenched for decades and proven reliable for many big corporations. Still coding COBOL after over half a century, while other languages have come and gone. Just saying. LOL

  8. If the author were older, he would know that there are still coders maintaining numerous programs written in COBOL 50 years ago. Claiming that these languages won’t be used in 10 years is completely absurd. They may be used less, but countless programmers will make a very good living maintaining legacy apps.

  9. Guidione Simiao Ndona

    I learned FORTRAN in 1998, in college, and I still use it, though of course I keep bringing myself up to date with other languages.
    Right now I’m getting into Python.
    I also believe that nothing dies; it just gets used less, but sometimes we peek back at it to apply it to new things…

  10. Christopher Fasano

    In 1980 I took a course in a language that was going to “take over the world,” after which no one would program in Fortran. The language was PL/1. Fortran is still here and in use where it excels; PL/1, not so much. I love when people write these articles. Languages will continue to exist as long as even a few people find them useful.

  11. Kirk Augustin

    All languages should be used when appropriate. If a language is well designed, there is no difficulty in using it by anyone, when there is some advantage to it. People should never concentrate on learning one particular language, but rather the commonalities of all languages. That way they will not be left behind as new and better languages are inevitably designed. But interpreted languages should never be considered significant compared to compiled languages, which can interact directly with hardware and be used to implement memory management, drivers, loaders, and the whole operating system itself. I bring this up because of how much Python is being taught in schools, when in reality it is a toy compared to C.

    • Andrew I Kaplan

      Having just retired after 50+ years in IT, I can remember the difficulty of converting applications that did financial calculations to Unix-based systems. Getting something as simple as a margin calculation for a brokerage account right required attempting to match a calculation done on an IBM mainframe in decimal instructions with Unix floating point. The industry required years to accurately calculate this seemingly simple task. Procedural languages were ideal for mathematical and arithmetic tasks, which are still the heart of data processing. Obviously these languages provided no real tools for the kind of screen manipulation and interaction required in today’s online world, but the need to process the data still exists.

    • solomon d gongola

      I understand what you intended to say, but maybe you should rephrase some of it.

      Which group would the .NET and JVM-based languages fall into? They really are interpreted by a virtual machine, but they do not actually interact with the hardware unless the virtual machine gives them a way to do it.

      Compiled languages that can interact directly with hardware and implement memory management, etc. really come down to very few languages. Besides assembler languages (which you are not talking about), there is C. There were also BLISS and Burroughs ALGOL, which are also gone. C++ and other languages do let you dynamically allocate/deallocate memory, but they do not really have the same capabilities as C and assembler languages: they can’t access device registers or do operating-system functions without going through an API or system call.

  12. What is COBOL? Well, it’s the greatest language ever designed. It’s easy to learn, easy to code, easy to read, easy to maintain. And it WILL NEVER DIE. So just accept it. Deal with it and make a ton of cash. Combine it with DB2 and MQ and you have yoself a game! It’s more fun than a bowl of chicken livers. I started using it in 1976 and never looked back. I’ll be using it till the cows come home. Hold on, I have a WORKING-STORAGE switch stuck in my teeth. OK, NEVER use switches unless absolutely necessary. And yes, “GO TO” is fine if used properly. Keep your chin up and a stiff upper lip. Speak softly and carry a big COBOL manual. Happy coding. (This message was not brought to you by IBM. It was brought to you by ISI.)

  13. anonymous

    R, like its predecessor Fortran and its business cousin COBOL, is not going anywhere. All of these have evolved and adapted to new tech. Who would have thought that 60 years would pass and the code and the language would still be around? Things change, but fundamentals do not.

  14. Miles Porter

    I used to hate R. I used to think R was a general-purpose programming language. I don’t hate R any more. That switch happened when I realized that R is NOT intended to be a general-purpose programming language; it is more of a statistical package. The difference is subtle. If I were to build an app, a backend micro-service, or a website, R would be a pretty questionable choice. However, if I needed to fit a multiple linear regression model as part of data exploration, I could do it in R in about 4 lines. That saves time, and the presentation features in R Notebooks are great for explainability. R will continue to fade a bit, but it is not going away. The declines we see in R usage are part of the “trough of disillusionment.” That does not mean R is going away; it is just being used less for things it is not good at.

  15. Dupes2k

    Agreed: retreaded articles built on speculation that have a negative impact. Even their analysis is flawed; for example, using Stack Overflow activity to suggest Python is more popular than PHP. Python is such a train wreck that people are seeking help all of the time. Plus, PHP has its own web site that provides answers, so fewer PHP developers use Stack Overflow.

    The only true test is to look at the number of jobs available, as that’s all that matters. Do we see COBOL, Fortran, or Ada jobs? No, we don’t, regardless of whether they are better languages. We see a lot of Python jobs because Python is a mess. Until more people abandon languages that suck, the jobs will still be there.

    IMHO, everything can be done with C and PHP: C for processing speed, PHP for a quick external tool with a few lines of code. Anything else is just noise and wannabe attempts to do what C and PHP can do, pushed by haters.

  16. Xerxes Benton

    People still learn Latin, a dead language, but its use as a true means of communication between two living, breathing humans is not a reality, so maybe that is a better way to think about this article. Is anything new being created with COBOL and Fortran, like new projects or apps? I do not think so; they are just learned to keep these ancient apps running. When you say companies use COBOL, I wonder how many businesses still use it; could anyone take a ballpark guess?

  17. Derek Emrie

    As a grad student in data science, I can report that Syracuse U. is still instilling R language concepts in students in Intro to Data Science, and since you can import SQL data, Python data, and Excel data into R, why would it go away? Of course, the double-edged sword of open source and the roughly 5K libraries (packages) available for RStudio make it both amazing and often very challenging/time-consuming to keep up with all that’s new (especially for those new to it altogether). Stack Overflow, the R community, etc. tend to help, but for newbies some of the posts may be over one’s head. Syracuse is exposing students to Python in its Natural Language Processing class, so I think Python is around for a good while. From a data science/analyst position, I have to say that a complete grasp of the business problem and the anticipated results is at least as important as coding skills; that’s why there are teams of scientists/analysts, where each individual has his/her own strong and weak points. Some are better at depicting use cases with graphical software or writing text for a paper/presentation, and then there are the “master coders” who ensure the problem can get answered (or at least offer insights into the various ways to look at things). Languages can be revived: think of the Native Americans who helped win WWII by using their native spoken language to baffle the enemy. Sometimes “retro” is good, lol!

  18. William Love

    Just to give the group some laughs … I make my living with VBA. It’s because two programs important to my industry (Rockwell Software and Microsoft Office) use it for their macros. It is very lucrative, and fun actually. Interestingly, when I was getting my BS in Mech Eng 40 years ago, Fortran was my go-to (pun?). I noticed recently that “As of March 2021, Fortran was ranked 22nd in the TIOBE index, a measure of the popularity of programming languages, climbing 12 positions from its ranking of 34th in January 2020.”

    • Rich Sandness

      There are some (I think) funny comments here, I have to say. But regarding “R”: mind you, I’m nowhere near as in the saddle as some of you are; I’m more of a distant peanut-gallery watcher at the opera, so to speak. I think what is important to see is that Nick is talking about 2030 (hello, is anyone home?). He did not say 2024 or 2026; he said 2030, and to his credit he did not infer that R is a gone-tomorrow fad. As I look around, the letter “R” (which is also in my name) still has plenty of acclaim, and it is software with a statistics niche that a lot of companies appreciate. Who knows whether something new might come through, as a sort of merge or combo of R, or whether some new application of it might arise? The future is not always easy to determine or to predict. So, dwarf that I am to most here, I would just say that while the “MIN” might be predicted (and even accurately so), the “MAX” might also rise, by symptoms (or convolutions), beyond one’s known logic. Nine (9) years from today is a bit of a way off; from a “Pythonic view” I’m not discouraged, and I find Nick’s assessment more than a little interesting, partly because “R” seems like a smaller bandwidth, yet that is not always a bad thing, per se. I’m opining, not divining. Just thought to say.

  19. Rich Sandness

    It appears, from what I’m seeing, that the thoughts about “R” were the ones that most reverberated (to my perception). Again, I would just say that nine years away is nine years away, and Nick K. did not say otherwise. Thus, as always, scope and context are important to enter into the equation.