Assembly Language Keeps Rising


Assembly language is a very low-level programming language used in niche areas such as bootloaders, device drivers, and embedded devices. In the minds of many, it’s the sort of language that computer-science students cover in their coursework and never use in their actual jobs. But as demonstrated by the TIOBE Index for July, Assembly language has enjoyed a steady rise in the rankings of the most popular programming languages.

Assembly language now stands in tenth place, ahead of even programming stalwarts such as Ruby, Visual Basic, Swift, and R. It is also well ahead of highly specialized languages such as MATLAB and Groovy. (TIOBE uses programming-related traffic from search engines and Websites such as Wikipedia and YouTube to generate its rankings.)

“Why would anyone write code at such a low level, being far less productive if compared to using any other programming language and being vulnerable to all kinds of programming mistakes?” read TIOBE’s note accompanying the rankings. “The only reasonable explanation for this is that the number of very small devices that are only able to run assembly code is increasing.”

Even toothbrushes and coffee machines “are running assembly code these days,” the note added. “Another reason for adoption is performance. If performance is key, nobody can beat [assembly language].” Assembly language is notable for the one-to-one alignment between its mnemonics and the machine-code instructions in a particular computer architecture’s instruction set (ISA); this is a world of difference from other programming languages, which can operate across multiple kinds of architecture.
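That one-to-one mapping can be seen directly from C++ using GCC-style inline assembly. This is a minimal sketch, assuming an x86-64 compiler such as GCC or Clang; on other architectures it falls back to plain C++:

```cpp
// Each assembly mnemonic corresponds to exactly one machine instruction:
// "addq %1, %0" below assembles to a single 64-bit ADD opcode, unlike a
// C expression, which the compiler may translate into any number of
// instructions.
long add_via_asm(long a, long b) {
#if defined(__x86_64__)
    asm("addq %1, %0" : "+r"(a) : "r"(b));  // one mnemonic, one instruction
    return a;
#else
    return a + b;  // portable fallback on non-x86-64 targets
#endif
}
```

Calling `add_via_asm(40, 2)` returns 42 either way; the point is that on x86-64 the programmer, not the compiler, chose the exact instruction emitted.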

It’s difficult for a human to get closer to the heart of the machine than via assembly language. For that reason alone, it’s a worthy subject of study for anyone interested in how CPUs and programming actually work. For working programmers, it’s also clear that assembly language represents an efficient way to build new products, especially as more and more household objects end up with processors onboard.

9 Responses to “Assembly Language Keeps Rising”

  1. Dan Sutton

    Good news for me, then, since that’s where I started.

    Note that there’s no such thing as “Assembly Code”. There’s “Assembly Language”, which is “assembled” (we don’t use “compiled” because the process is one-to-one instruction to op-code, thus the normal types of compiler function such as optimization don’t apply) by an “Assembler” into “Machine Code”. The phrase “Assembly Code” is an oxymoron.
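To make that one-to-one translation concrete, here is a hand-assembled fragment sketched in C++ (the byte encodings are standard x86-64, shown for illustration): each mnemonic maps to exactly one opcode sequence, which is why an assembler needs none of a compiler’s transformations.

```cpp
#include <cstddef>

// One mnemonic, one encoding -- exactly the translation an assembler
// performs when turning assembly language into machine code.
struct Encoding {
    const char *mnemonic;
    unsigned char bytes[8];
    std::size_t len;
};

const Encoding kTable[] = {
    {"mov eax, 42", {0xB8, 0x2A, 0x00, 0x00, 0x00}, 5},  // B8 + imm32
    {"ret",         {0xC3},                          1},  // C3
};

// Total machine-code size of the two-instruction routine above.
std::size_t total_bytes() {
    std::size_t n = 0;
    for (const Encoding &e : kTable) n += e.len;
    return n;  // 6 bytes
}
```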

  2. David Jonathan Bush

    Back when I wrote games for the Motorola M6809, each CPU model had its own instruction set. Lance Leventhal would write an excellent book about each one. I wonder how many different languages are lumped together here under the same “Assembly Language” banner. Does an Athlon use the same language as an Intel? It seems unlikely. I wonder what a statistical breakdown of the most popular CPUs and GPUs would look like.

  3. Dan Sutton

    @David Jonathan Bush: Athlons, Pentium derivatives, and other 64-bit x86 chips use an extended version of the old Intel 8086 16-bit instruction set, with new instructions to handle larger register sizes, operations, etc. What tends to happen is that someone like Microsoft swears allegiance to a manufacturer’s standard (in the late ’90s it was AMD; now it’s Intel), and other manufacturers had better make sure their chips conform to the standard that the new version of, say, Windows is going to target. So yes: an Athlon uses the same instruction set as an Intel. Manufacturers can (and could) build extensions onto their own base instruction set, but this tends to be counterproductive, since it leads to incompatibility in terms of software development.

    However, this article appears (to me, at least) to be talking about different chipsets altogether, such as ARM, which might be embedded into small devices. The machine code for those systems is entirely different, since the chips’ architecture is different: we’re talking about a RISC-style design without the complex (CISC) instructions that Intel/AMD chips use.

  4. One thing that held assembly language back was a lack of structure. Macros, however, can be used to form nestable program structures and a lot of other abstractions, dramatically raising the level of the language, programmer productivity, and code maintainability while reducing bugs, all with (in most cases) zero penalty in run speed or memory, since the macros assemble to the very same thing you would write out by hand without them. I started using a nice macro assembler 30 years ago, and I’m still finding out just how much can be done with one: things that even today have not occurred to most assembly-language programmers. I wish I had figured all this out back then.
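The same idea can be sketched in C++ preprocessor terms (assembler macro facilities, such as those in MASM or GNU as, work analogously over instruction sequences): the macro expands at translation time into exactly the code you would have written by hand, so the abstraction is free at run time.

```cpp
// REPEAT expands textually into a plain counted loop before the compiler
// (or, in an assembler, before the assembler proper) ever sees it; the
// generated code is identical to the hand-written version, so the
// structured form costs nothing at run time.
#define REPEAT(n, body) for (int rep_i = 0; rep_i < (n); ++rep_i) { body; }

int sum_with_macro() {
    int total = 0;
    REPEAT(5, total += 2)  // expands to a for loop executing the body 5 times
    return total;          // 10
}
```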

    BTW, thank you for correctly calling it “assembly language” rather than “assembler”: “assembler” is not a language but the piece of software that takes assembly-language source code and produces the machine language the microprocessor understands.

  5. By the time you wrote this, your ‘stalwarts’ weren’t stalwarts at all.

    Ruby was already on its way to becoming the next Perl (the only real jobs being maintenance of existing legacy code), and Swift had just come out, brand spanking new; it was really only popular because walled-garden phone systems pushed it on people (and Objective-C before that). Let’s be real: given a real choice, people would have used bog-standard C/C++, but Swift helped keep the Apple ecosystem locked down and software less portable.

    Visual Basic has long been on its deathbed, ever since the real interpreted BASIC died out. It’s just .NET with a clunkier, less powerful syntax, and C# filled the void. That immensely displeased a lot of barely self-educated programmers who didn’t know the first thing about computer science; many became quickly unemployable unless they actually learned the parts of C that scared them off in the first place (minus the memory management).

    Really, VB.NET is a return to what BASIC was meant to be: not a professionally used language, but something for newbies to play with at home who were probably never going to have a programming career or hobby in earnest. Unless they were under 15; then it was an acceptable start. But today we have better languages for that. I’d push Python on a kid before rotting their brain out with BASIC, and if they couldn’t hack it I’d tell them to quit and find another hobby.

    And R? What the hell is R? Who programs in R? I’ve never met anyone, or been in any programming chat room, where it was mentioned.

    ASM is slowly petering out in terms of jobs and use. It’s no longer used in games, and at this point it’s pretty much ONLY used for bootloaders, the rare thing that can’t be done in C, or really cheap embedded hardware (and even that’s fading, as the cheap stuff is quickly becoming fast enough to run C).

    I love assembly language, and C/C++. It’s really tough to get me to move away from older paradigms of complete control and extra required effort. And I’m not even that old! And I’m not a professional (although I have some CS education, up to an associate’s degree, discounting any and all math).

    I’ve just done this as a hobby for a long time. It took me a while to begin to grok ASM, and in a way it’s sad it’s not as widely used as it used to be.

    There are many cases where it’s still used, such as the cheapest of the cheap embedded stuff (microwaves, coffee pots), but there it’s extremely simple, because you’re basically just turning things off and on rather than writing entire programs (unless you’re building a homebrew PC kit or something and actually writing software for users to run, say Tetris in assembly, rather than managing hardware states), or at the lowest levels of hardware design (building CPUs and architectures), or in compilers, VMs, etc.
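That “turning things off and on” style of firmware looks roughly like this in C++. The register and bit assignment here are hypothetical; on real hardware, `GPIO_OUT` would be a fixed memory-mapped I/O address taken from the device’s datasheet, simulated below with an ordinary variable:

```cpp
#include <cstdint>

// Simulated memory-mapped GPIO output register (hypothetical; a real
// device's datasheet would give the actual address and bit layout).
static std::uint8_t gpio_backing = 0;
#define GPIO_OUT (*(volatile std::uint8_t *)&gpio_backing)

constexpr std::uint8_t HEATER_BIT = 1u << 3;  // assumed: bit 3 drives the heater

void heater_on()    { GPIO_OUT |= HEATER_BIT; }                                  // set bit 3
void heater_off()   { GPIO_OUT &= static_cast<std::uint8_t>(~HEATER_BIT); }      // clear bit 3
bool heater_is_on() { return (GPIO_OUT & HEATER_BIT) != 0; }
```

The whole program is a handful of read-modify-write operations on one register, which is why such firmware was (and sometimes still is) practical to write directly in assembly.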

    There are places for it, but only the most hardcore of places. Really, the big thing was Moore’s law improving CPU performance enough that you could afford to lose a good amount of it, combined with management not liking to hire extremely high-paid, hard-to-fire, hard-to-replace wizards on their respective platforms when they could hire more, less-skilled programmers to work on a problem and get more productivity out of them.

    If you really want to be a programmer, I’m seriously going to recommend you drop all the trendy, easy, super-high-level languages and learn some real CS. Go pick up Rust, C, even C++. Hell, if you’re going to stick with interpreted languages, go with Python, Lua, or… anything else, really.

    If you want .NET, learn C#.

    Visual Basic .NET is a dead language professionally, and sticking to simple, platform-locked stuff may seem like a great idea now, within an Apple ecosystem that loves to throw money at things that are free or excessively cheap everywhere else; but you’re going to end up vendor-locked yourself, without the base concepts that let you ‘easily’ switch to brand-new languages you’ve never even heard of before.

    Like, I don’t know what the hell R is, but I could learn it in a few days so long as it’s not functional (then it might take a few weeks or months). You could hire me tomorrow and I could just figure it out without ever having learned it, or even having a fancy CS degree.

    And that’s because I have the CS knowledge. If they’d let me take classes without any prereqs, I could pass a final with a few weeks of study without even taking the class.

    And you know why? Because I never took it easy on myself. I never went with the trends. I always try to solve problems the hard way, and do it in either the most performant or the most portable way, every time.

    That’s why I hated Objective-C: I was never going to write code solely for Apple products, even if that meant sacrificing a large portion of the money I could have made from programming by getting hired on as an official dev.

    The fact is, though, that I COULD do it if I wanted. I’m not limited.

    Because even in 2016 I knew Visual Basic was dead. The last time I used it, I was about ten years old, before September 11th had even happened.

    Back then it was semi-relevant, since industry had no other way of getting ‘quick internal business apps’ or prototypes out of lower-wage, lower-skilled programmers very quickly.

    Today there are much better compromises available to achieve the same ends, and the nice thing is, they’re much more similar to other languages and can easily be ported to another language in a matter of hours or days.

    C# is different, but it’s not that hard to port your C# program in its entirety, even a big one, to C++. It’s just: add in manual memory management and test each function as you translate. Boom. Optimize and roll out for release.
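As a toy illustration of that translation step (the C# original in the comment is made up for the example), a `List<int>`-based routine maps almost line for line onto C++ with `std::vector`, which handles the memory management the garbage collector used to:

```cpp
#include <numeric>
#include <vector>

// C# original (hypothetical):
//     var xs = new List<int> { 1, 2, 3 };
//     xs.Add(4);
//     return xs.Sum();
//
// C++ translation: std::vector owns its memory and releases it
// automatically (RAII), so the "manual memory management" here amounts
// to choosing the owning container.
int sum_ported() {
    std::vector<int> xs{1, 2, 3};
    xs.push_back(4);
    return std::accumulate(xs.begin(), xs.end(), 0);  // 1+2+3+4 = 10
}
```

The mechanical, function-by-function nature of the mapping is what makes the port tractable even for a large codebase.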

    • Dan Sutton

      There’s actually something more: as well as hardware becoming fast enough to run C programs, C compilers themselves have improved boundlessly. For example, I tested the Microsoft C++ compiler in Visual Studio 2017 against my own assembly-language solutions to various problems, and it tended to write, almost op-code for op-code, exactly what I wrote manually. In some cases it was more efficient, because it would have, say, two exits from a loop when that saved a processor cycle. Fantastic compiler optimization. At this point, it’s rare that writing something in assembly language saves you anything at all in terms of runtime speed.
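A small illustration of the kind of optimization being described (compiler output varies by vendor, version, and flags, so treat this as a sketch): at -O2, GCC and Clang typically compile the loop below into a closed-form multiply rather than an actual loop, matching or beating a hand-written assembly loop.

```cpp
// Sum of 0..n-1. Modern optimizers commonly replace this loop with the
// closed form n*(n-1)/2 (a few lea/imul/shr instructions), so the
// "naive" C++ runs as fast as carefully hand-scheduled assembly.
long sum_to(long n) {
    long s = 0;
    for (long i = 0; i < n; ++i) s += i;
    return s;
}
```

Inspecting the generated assembly (for instance with `g++ -O2 -S`) is an easy way to verify this on any given compiler.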