Implementing simple sort algorithms in ARM Assembly (part 2)

I haven’t completed the code yet, but I wanted to share my progress learning ARM assembly by implementing a simple sort algorithm (part 1 is here). I’m committing my changes as I go, so if you’re interested you can also pull the code from GitHub here.

The simple sort that I’m implementing is a ‘comparison sort‘ (specifically, a selection sort). You start at the lowest end of the array of values, iterate through to find the smallest value, and then swap that smallest value to the front. You then repeat the loop starting at the next index in the array, search again for the smallest value, swap, and continue repeating this until you’ve looped through and compared all the values.

I’ll make clear that, as I’m learning ARM ASM, I’ve no idea at this point whether my approach to implementing this algorithm is optimal, but I’m finding it a useful learning exercise. I’m also finding debugging the code in Eclipse C++ indispensable – I don’t think at this point I could debug the code without an IDE (or to try would be difficult and error prone). Once you’ve walked through the steps to cross-compile in Eclipse C++, you can use the same setup to remote debug in Eclipse C++ too, with the executable running remotely on the Raspberry Pi.

So far I have the outer and inner loops working, so I can iterate through the values and compare to find the smallest value on each iteration. I’ll post another update once I’ve got the swapping done. In the meantime, if you’re interested you can take a look at my latest commit in my GitHub repo above.

Uncle Bob: “Make the Magic Go Away” – why you should learn some Assembly

I’ve been spending some spare time learning some ARM Assembly (and sharing some of my experiences here, here and here).

In the early 90s at college I did a module on 68000 Assembly on the Atari ST, but I haven’t done any since. I remember being amazed at how complicated it was to implement even the simplest of code, since you’re dealing with a very limited set of instructions – the instructions that the CPU itself understands. At the same time, though, you gained an insight into what goes on under the covers, how the computer itself works – how the CPU’s registers are used, and how data is transferred from registers to memory and vice versa. It’s computing at its most elemental level: you’re working with the bare metal hardware.

Since I’ve also recently been playing around with random stuff on the Raspberry Pi, I thought I’d take a look at the ARM CPU and learn some ARM Assembly. I felt a need to get back to basics and learn about the architecture of ARM CPUs and what makes them tick. As much as this sounds pretty hardcore and crazy, ARM CPUs are showing up pretty much everywhere, and you probably don’t even know it. There’s a good chance at least one, if not more, of the mobile devices you currently own or have owned over the past few years has been powered by an ARM CPU. So given the memory and CPU constraints of small form factor devices, and also IoT type devices, it’s not completely off the wall to be interested in learning some ARM Assembly.

Anyway, back to my original point. If you want to understand what makes a computer tick (literally), you can’t go far wrong by learning some Assembly. You’ll get a far better understanding of what goes on under the covers, and a new appreciation of just how much abstraction there is in today’s high level languages (Java, C#, Objective-C etc.) – how much they do for you without you ever really having to know what’s going on underneath. But if you really want to get a deeper understanding, you lift the hood/bonnet and start poking around in the engine, right?

It surprised me when I came across this post by Uncle Bob recently:

http://blog.8thlight.com/uncle-bob/2015/08/06/let-the-magic-die.html

Bob comments on the industry’s continual search for the perfect language or library. We’re continually re-inventing languages and frameworks, but nothing revolutionarily different is being ‘invented’ – they’re all solving the same problems, and not really offering anything new. Bob even goes as far as to say there really hasn’t been anything new in computer languages for 30 years.

The unusual thing is that we seem to get caught up in the promise that maybe the next big language or framework will ‘solve all problems’ and do it better than every language and framework before it, but there’s still really nothing new.

Bob’s point:

But there is no magic. There are just ones and zeros being manipulated at extraordinary speeds by an absurdly simple machine. And that machine needs discrete and detailed instructions; that we are obliged to write for it.

And continues:

I think people should learn an assembly language as early as possible. I don’t expect them to use that assembler for very long because working in assembly language is slow and painful (and joyous!). My goal in advocating that everyone learn such a language is to make sure that the magic is destroyed.

And here it is, the reason why you should learn Assembler:

If you’ve never worked in machine language, it’s almost impossible for you to really understand what’s going on. If you program in Java, or C#, or C++, or even C, there is magic. But after you have written some machine language, the magic goes away. You realize that you could write a C compiler in machine language. You realize that you could write a JVM, a C++ compiler, a Ruby interpreter. It would take a bit of time and effort. But you could do it. The magic is gone.

I don’t know exactly what prompted me recently to start learning Assembler, but these comments from Uncle Bob resonated with me. If you don’t know how a computer works, how do you expect to understand what is going on when you develop code to run on it?

So there you go. Bob said it. Go learn Assembler. Maybe you’ll learn something.


Oracle: Google has ‘destroyed’ the future of Java on mobile devices

As a long time Java developer (since 1996) and advocate of the language and platform, I’m deeply saddened by the legal action from Oracle against Google and Android. If anything, what Google has achieved is nothing short of incredible and outstanding: they have turned an arguably Java based/influenced platform into by far the most successful mobile device platform, something which Sun and now Oracle were never able to achieve.

Instead of crying over their lost opportunity, Oracle should be doing everything possible to partner with Google and license Android and/or adopt it as the mobile device platform for Java.

The joke that is Java ME needs to be ditched. It’s had its time. It was on almost all of (what are now called) feature phones sold years back, but no-one apart from (some) Java developers knew this, so now even that potential success is nothing but a lost opportunity.

Please Oracle, do yourself a favor and preserve what little respect you have left from your loyal Java developers: if there’s anything being destroyed here, it is our faith in you as a company and as the guardian of Java.

Ditch Java ME, and license Android from Google as the new Java ME.

Android is what Java ME should have been from day one.