Okay, Linux? Sept 30, 2016 7:14:50 GMT
Post by Trog on Sept 30, 2016 7:14:50 GMT
I’ve been through quite a journey, these past few months.
It started with me deciding to get up to date on what has been happening in graphics processing these days, and to expand my rudimentary OpenGL knowledge at the same time. In the process I learned that OpenGL had moved on to version 3, and that it is related to prior versions of OpenGL in name only.
Then I got involved in a discussion about Bitcoin, and I discovered that early Bitcoin mining was implemented on graphics processors. Graphics processors?! How does that figure?
Turns out that the power, and indeed the architecture, underlying OpenGL 3 is a direct consequence of modern graphics processors being massively parallel computing devices. You can use them as such for any parallel computing task – their being used for graphics is almost incidental, and the term ‘graphics’ is close to a misnomer.
So I tried to get up to speed with using graphics processors as massively parallel computers, and fortuitously for me there is an industry standard for doing so: OpenCL, whose kernel language is a variant of C, and whose name is a play on OpenGL, of course. So it all tied together nicely.
Anyway, on my little laptop with its mediocre Intel graphics processor, I managed to multiply two million-element arrays with each other in less than a second, which works out to about 2 GFLOPS, making it more than 12 times faster than a Cray-1 computer.
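An elementwise multiply like that is about the simplest thing one can express in OpenCL’s kernel language. A minimal sketch of what such a kernel could look like – the names here are illustrative, not the actual code, and the host-side setup (platform, context, queue, buffers) is omitted:

```c
/* OpenCL C kernel: each work-item multiplies one pair of elements.
   Illustrative sketch only; this runs on the device, launched by
   host code that is not shown. */
__kernel void multiply(__global const float *a,
                       __global const float *b,
                       __global float *result)
{
    size_t i = get_global_id(0);  /* this work-item's index */
    result[i] = a[i] * b[i];
}
```

The point is that you write the code for a single element, and the device runs it across as many elements as you enqueue, in parallel.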
Imagine 12 liquid-cooled Cray-1 computers, at 9 million dollars each (1980s dollars), weighing a total of 60 tonnes and consuming 1.9 MW of power, being clobbered by my laptop.
As I said, the Intel graphics processor is not considered remarkable at all, and the high-end graphics processors used by gamers (e.g. the NVIDIA GeForce TITAN X) outperform it by a factor of thousands. Seymour Cray must be turning in his grave. 12 000 Cray-1s?! For a few dollars?
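The arithmetic behind those comparisons can be checked in a couple of lines. The peak figures below are my assumptions – commonly quoted numbers, not measurements from this post:

```python
# Back-of-the-envelope check of the supercomputer comparison.
# Assumed figures (not from the post): Cray-1 peak throughput was
# roughly 160 MFLOPS; the GeForce GTX TITAN X (Maxwell) is usually
# quoted at roughly 6.1 TFLOPS single precision.

laptop_gflops = 2.0       # the figure measured on the laptop
cray1_gflops = 0.160      # Cray-1 peak, ~160 MFLOPS (assumption)
titan_x_gflops = 6100.0   # quoted TITAN X peak, ~6.1 TFLOPS (assumption)

laptop_vs_cray = laptop_gflops / cray1_gflops     # laptop : Cray-1
titan_vs_laptop = titan_x_gflops / laptop_gflops  # TITAN X : laptop

print(f"Laptop vs Cray-1:  {laptop_vs_cray:.1f}x")    # 12.5x
print(f"TITAN X vs laptop: {titan_vs_laptop:.0f}x")   # 3050x
```

So the “more than 12 times faster” and the “factor of thousands” both check out against these assumed peak numbers.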
In this bringing-myself-up-to-date-with-technology exercise, I also stumbled across IoT, the Internet of Things. (Where the xxxx have I BEEN these past 10 years?) The idea of IoT is that every device could be reachable from the internet – you can sit in London, log on to a website and switch on your bedside lamp in Mosselbaai, for instance. Or feed the cat.
And so I stumbled across the Intel Edison, and promptly bought myself one for about R700. It is basically a dual-core Intel Atom-based computer, with an additional microcontroller thrown in to handle I/O, and 1 GB of RAM plus 4 GB of flash storage, on something the size of two SIM cards lying next to each other. Oh yes, and it has built-in Bluetooth and WiFi, and is therefore inherently internet-capable.
It is probably the ultimate IoT development component available at the moment. You could make something to stick in your nose and carry around, if you wanted to. We could give Malema a brain. (I once worked as a process engineer in probably the highest-tech manufacturing plant in South Africa – 20 years ago. I think we could probably run the entire plant from a single Intel Edison today.)
Which brings me to Linux. The Intel Edison, being a full-on computer, needs some sort of operating system, and unfortunately it uses Linux. Unfortunate for me, that is, since I’ve always avoided it like the plague, so now I need to learn THAT as well.

My first impressions are not good – for this particular environment, at any rate. One would naturally want real-time capability on the kind of device the Edison is meant to be, and stock Linux is not a real-time operating system. GPIO interrupts, for instance, reach user space as events you wait for by poll()-ing on files?! Anyway, it is very, very early days yet, and for once I seem to have found myself exploring a leading-edge technology again. I am VERY EXCITED! And approaching it in my usual compulsive/obsessive way.
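For reference, the “files” in question are Linux’s sysfs GPIO interface: you configure a pin by writing to files, and a program then blocks in poll(2) on the pin’s value file until an edge (the interrupt) arrives – so it is event-driven rather than busy-polling. A sketch, with a purely hypothetical pin number, and hardware-dependent so not something you can run without a board:

```shell
# sysfs GPIO sketch -- pin number 48 is hypothetical, not Edison-specific
echo 48   > /sys/class/gpio/export            # make the pin visible
echo in   > /sys/class/gpio/gpio48/direction  # configure as input
echo both > /sys/class/gpio/gpio48/edge       # report rising+falling edges
# A program then calls poll(2) on /sys/class/gpio/gpio48/value;
# poll() sleeps until the edge fires, so no CPU is burnt waiting.
```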