Post by Trog on Sept 30, 2016 7:14:50 GMT
I’ve been through quite a journey these past few months. It started with me deciding to get up to date on what has been happening with graphics processing these days, and to simultaneously expand my rudimentary OpenGL knowledge - and learning in the process that OpenGL has moved on to version 3, which is related to the prior versions of OpenGL in name only.

Then I got involved in a discussion about Bitcoin, and I discovered that earlier Bitcoin mining was implemented on graphics processors. Graphics processors?! How does that figure? It turns out that the power, and indeed the architecture, underlying OpenGL 3 is the direct consequence of modern graphics processors being massively parallel computing devices, and that you can use them as such for any parallel computing task - their being used for graphics is incidental, and the term ‘graphics’ is almost a misnomer.

So I tried to get up to speed with using graphics processors as massively parallel computers, and fortuitously for me the industry standard for doing so is OpenCL, whose kernel language is a variant of C and whose name is, of course, a play on OpenGL. So it all tied together nicely. Anyway, on my little laptop with its mediocre Intel graphics processor, I managed to multiply two million-element arrays with each other in less than a second, which works out to about 2 GFLOPS, making it more than 12 times faster than a Cray 1 computer (a rough sketch of the approach appears at the end of this post). Imagine 12 liquid-cooled Cray 1 computers, at 9 million dollars each (1980s), weighing a total of 60 tonnes and consuming 1.9 MW of power, being clobbered by my laptop. As I said, the Intel graphics processor is not considered remarkable at all, and the high-end graphics processors used by gamers (e.g. the NVIDIA GeForce TITAN X) outperform it by a factor of thousands. Seymour Cray must be turning in his grave. 12000 Cray 1’s?! For a few dollars?

In this bringing-myself-up-to-date-with-technology exercise, I also stumbled across IoT, the Internet of Things. (Where the xxxx have I BEEN these past 10 years?) The idea of IoT is that every device could be reachable from the internet - you can sit in London, log on to a website and switch on your bedside lamp in Mosselbaai, for instance. Or feed the cat. And so I stumbled across the Intel Edison, and promptly bought myself one for about R700. It is basically a dual-core Intel Atom-based computer, with an additional microcontroller thrown in to handle IO, 1 GByte of RAM and 4 GBytes of flash storage, on something the size of two SIM cards next to each other. Oh yes, and it has built-in Bluetooth and WiFi, and is therefore inherently internet capable. Intel Edison (SparkFun)
It is probably the ultimate IoT development component available at the moment. You can make something to stick in and carry around in your nose, if you want to. We can give Malema a brain. (I once worked as a process engineer in probably the highest-tech manufacturing plant in South Africa - 20 years ago. I think we could probably run the entire plant from a single Intel Edison today.)

Which brings me to Linux. The Intel Edison being a full-on computer, it needs some sort of operating system, and unfortunately it uses Linux. Unfortunate for me, that is, since I’ve always avoided it like the plague, so now I need to learn THAT as well. My first impressions are not good - for this particular environment, at any rate - since one would naturally want real-time capability on something the Edison is supposed to be, and Linux seems to struggle in a real-time environment. Interrupts are handled through polling, for instance, and the results treated as files?!

Anyway, very, very early days yet, and for once I seem to have found myself exploring a leading-edge technology again. I am VERY EXCITED! And approaching it in my usual compulsive/obsessive way.
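A minimal sketch of the approach mentioned above (illustrative only, not the actual benchmark code; it uses PyOpenCL, the Python bindings for OpenCL, and random input arrays):

import numpy as np
import pyopencl as cl

n = 1000000
a = np.random.rand(n).astype(np.float32)
b = np.random.rand(n).astype(np.float32)

ctx = cl.create_some_context()                 # picks whatever OpenCL device is available
queue = cl.CommandQueue(ctx)

mf = cl.mem_flags
a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

# The kernel language is C-like; each work-item multiplies one pair of elements.
prg = cl.Program(ctx, """
__kernel void multiply(__global const float *a,
                       __global const float *b,
                       __global float *out)
{
    int i = get_global_id(0);
    out[i] = a[i] * b[i];
}
""").build()

prg.multiply(queue, (n,), None, a_buf, b_buf, out_buf)

result = np.empty_like(a)
cl.enqueue_copy(queue, result, out_buf)        # copy the product back to the host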
Post by cjm on Oct 1, 2016 16:58:02 GMT
Respect!
I only have some general experience of Linux in the form of Ubuntu and cannot offer much help! There are I-don't-know-how-many versions of Linux (e.g. Red Hat, SUSE) which I have never tried. Since about release 4 (when my own journey commenced) Ubuntu has been much improved and simplified. Initially GUIs were few and far between. Presumably the software can be run on any of these systems. It also seems that Linux dominates the world of servers.
My interest started when viruses messed up my Windows 98 and I also had to decide whether to move to XP. Since Linux was free and reputedly offered resistance against viruses, I tried it. For years since, I have used neither a firewall nor anti-virus programs. It does not have the commercial glibness of Windows, but on the other hand I now prefer Linux. Windows makes me feel as if I am peering through a very small window at my system. It also feels cumbersome, and the ads drive me crazy, particularly as I do not have much bandwidth.
For my simple needs the available software is adequate, although it sometimes takes an effort to get programs working. The point is that if I can use it, most people can, and the reputation of Linux as only a geek pursuit is not accurate.
I noted with interest your complaint about the relevant output being rendered as files. Although it does not excuse inefficiency, every beginner’s manual points out that in Linux (a variant of Unix), everything is a file. I therefore assume that it is a design feature. The kernel plays an important role in the system and is revised and updated very often.
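A minimal illustration of that design feature (the pin number is hypothetical and board-specific): a GPIO pin on an embedded Linux board is configured and read purely by writing to and reading from files under /sys/class/gpio.

GPIO = "48"                                     # hypothetical pin number

try:
    with open("/sys/class/gpio/export", "w") as f:
        f.write(GPIO)                           # ask the kernel to expose the pin as files
except OSError:
    pass                                        # pin was already exported

with open("/sys/class/gpio/gpio" + GPIO + "/direction", "w") as f:
    f.write("in")                               # configure the pin as an input

with open("/sys/class/gpio/gpio" + GPIO + "/value") as f:
    print(f.read().strip())                     # the pin's state is read like any other file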
Whether any of these profound statements is relevant in the least to your OS, I have no idea!
Welcome to the world of Linux!
Post by cjm on Oct 1, 2016 17:10:51 GMT
Rereading your post, I started wondering about Arduino - something I have encountered before in a personal quest to build an alarm which can send a message to a cell phone. There is indeed a connection, I discovered from your link.
Intel released an Arduino Uno-compatible board (with only 4 PWM pins instead of 6) that accepts the Intel Edison module.
Post by Trog on Oct 2, 2016 9:29:08 GMT
When I discovered the IoT and the Edison, I looked very carefully at the Arduino option. I decided against it, because:

1. It is actually a waste to use the Edison in an Arduino. You lose about 90% of the power of the Edison, which is going to sit there and do mostly nothing.

2. If you do prototype with it, you will be prototyping with the Arduino, not the Edison. I prefer to prototype with the Edison as such, so that I will eventually be able to transfer the prototype onto a custom-designed printed circuit board with the Edison as a component. This should be much, much smaller than doing the equivalent with the Arduino. Said another way: for anything you do with the Arduino, should you eventually build a custom circuit board for it, it will always be something added on to the Arduino; it will never be an integrated system with the Arduino as a component.

3. The Edison already has everything I want, and the Arduino adds nothing to it.

The downside is that you have to do everything yourself, and that the Edison works on 1.8V logic, whereas almost every electronic component you can currently attach to it works at either 3.3V or 5V logic levels, so you must do the level shifting yourself - something the Arduino already does for you. I think the Arduino is a great concept - I'm just going in another direction.

But to get an Arduino to SMS you when an alarm is triggered is almost trivial: FONA 808 shield - mini cellular GSM + GPS for Arduino. You would need to do some programming, though, in a development environment you have already installed on your computer.

As for Linux in a real-time system: with the Edison I guess the way to go is to have the Quark microcontroller handle the real-time stuff (interrupts) and to use the Atom processor (the one running Linux) to communicate with the real world (aka the internet) - see the sketch at the end of this post. The Quark has its own real-time OS and is independent of Linux. (Intel has only recently made development tools available to access the Quark, so right now there is almost no documentation available for it.)

(I'm talking of designing circuits and producing custom printed circuit boards as if I've been doing it all my life. Actually, I'm learning this as I progress with this exercise - never done it before. One can reflect on the fact that only a few years ago, before the current capabilities of the internet, this would have been impossible.)
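A rough sketch of the Linux side of that split (the device path /dev/ttymcu0 is taken from Intel's MCU SDK documentation; both the path and the message format should be treated as assumptions): the Quark firmware services the interrupts and pushes time-stamped readings over the inter-processor channel, and a small script on the Atom side just picks them up.

# Linux (Atom) side: pick up whatever the Quark MCU firmware writes to its
# host-facing channel. Path and message format are assumptions, not tested here.
with open("/dev/ttymcu0", "r") as mcu:
    while True:
        line = mcu.readline()          # e.g. "1498809600.123 ALARM" from the MCU
        if line:
            print("from MCU:", line.strip())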
Post by Trog on Jun 6, 2017 9:05:52 GMT
Intel Edison as a Web Server

The Edison is, of course, an IoT device. The problem with IoT devices generally is: how do you get the Thing onto the Internet? Generally, at some stage the Thing can be associated with some IP address, but that will almost never be a static address on The Internet. Static internet addresses cost a fortune to acquire and to maintain, and are probably too valuable to assign exclusively to a Thing anyway. Therefore, to make a Thing addressable from The Internet, all kinds of shenanigans involving both web and DNS servers must be performed.

Anyway - the link above is to my Intel Edison as exposed to the web. The Edison is WiFi capable. I've configured the Edison to connect to my Vodacom WiFi modem on startup. I've made use of a commercial relay service, Yaler, to expose the Edison to the Internet. The Intel Edison runs a version of Linux called Yocto Linux. On the Edison, I'm running a Python web server (a minimal sketch of such a server follows at the end of this post). Lo - you can connect to my Edison with your internet browser.

On the web page, I have a photo of the Intel Edison next to my mouse. Of course, I can put anything on that web page - e.g. measurements from sensors connected to the Edison, buttons to switch relays, controls to drive stepper motors, or the video stream from a USB camera. The Edison itself is actually the stamp-like thingamy with 'Edison' printed on it - it is mounted on a breakout board for infrastructure support and to expose the IO pins. There is a micro-USB cable connected to it, to supply power only.

(The link to my Edison will probably only be active for a short time, and perhaps even during that time be sporadically unavailable, as you can imagine. I'm using my equipment for other things, and this setup is of a very temporary nature.)
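A minimal sketch of that sort of server, using only Python's standard library (illustrative only, not the actual server: the photo filename and sensor reading are placeholders, the Yaler relay setup is not shown, and it is written for Python 3, whereas the Edison's Yocto image shipped Python 2.7, where the module names differ):

from http.server import BaseHTTPRequestHandler, HTTPServer

PAGE = """<html><body>
<h1>Intel Edison</h1>
<img src="/edison.jpg" alt="Edison next to a mouse">
<p>Sensor reading: {reading}</p>
</body></html>"""

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/edison.jpg":
            try:
                with open("edison.jpg", "rb") as f:   # photo stored next to the script
                    data = f.read()
            except OSError:
                self.send_error(404)
                return
            self.send_response(200)
            self.send_header("Content-Type", "image/jpeg")
            self.end_headers()
            self.wfile.write(data)
            return
        body = PAGE.format(reading="23.5 degC")        # placeholder for a real sensor value
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(body.encode("utf-8"))

HTTPServer(("", 8080), Handler).serve_forever()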
Post by cjm on Jun 6, 2017 17:53:26 GMT
Gees.....and I cannot even get my phone to connect to my pc
Post by Trog on Jun 7, 2017 9:27:22 GMT
"Gees.....and I cannot even get my phone to connect to my pc"

Could you actually see my Edison? I found the setup to be rather fragile - a bit of a hit-or-miss affair. The reasons, I think, are that in the first place a standard Vodacom client 3G connection is not ideal for servicing a web server - mostly, browsers just decide that the thing is too slow and time out. Secondly, I'm sure that the relaying service, Yaler, slows things down even more. I guess that currently, although IoT is supposed to give easy browser access to devices, that access is really meant to be from browsers and devices on the same local area network, not the WWW - in which case it works like a charm.

As for myself, the above are really problems associated with the Rapid Development crowd - those people who develop in Integrated Development Environments in .Net, Java and Visual Basic. I call them Bread And Butter programmers. I'm really looking at developing a mechanism where a laptop/computer on the WWW can open a direct connection to a Berkeley TCP socket server on the Edison and communicate with (provide a visual interface for) the Edison directly in compressed binary, bypassing any relay server, Linux inefficiencies, web servers running on the Edison, the HTTP/XML protocols, browser attempts at interpretation, etc. (a rough sketch follows below). For that, even communicating through Vodacom 3G will be plenty fast enough.
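A minimal sketch of that mechanism (the port number, the record format and the read_sensor() stand-in are illustrative assumptions; a matching client would unpack the frames with the same struct format):

import socket
import struct
import time
import zlib

def read_sensor():
    return 23.5                       # placeholder for a real measurement

srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
srv.bind(("", 5000))                  # arbitrary port for illustration
srv.listen(1)

while True:
    conn, addr = srv.accept()
    # pack (timestamp, reading) as little-endian double + float, then compress
    payload = struct.pack("<df", time.time(), read_sensor())
    blob = zlib.compress(payload)
    # length-prefix the frame so the client knows how many bytes to expect
    conn.sendall(struct.pack("<I", len(blob)) + blob)
    conn.close()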
Post by cjm on Jun 7, 2017 11:48:57 GMT
I had no problem accessing the website. The photo took a while to appear.
Post by Trog on Jun 7, 2017 13:58:31 GMT
"I had no problem accessing the website. The photo took a while to appear." Thanks.
Post by Trog on Jun 30, 2017 8:49:31 GMT
Intel announced last week that they are discontinuing the Edison (Hackaday). They must've cost the world a few billion dollars' worth of wasted effort for those who thought of developing for it. (In contrast, the 8080 processor evolved over the years into the Intel super-processors of today, which are still compatible with it - that is almost 50 years.) If you can't trust Intel to maintain a product line, who can you trust?

Maybe Samsung. This is what I'll be moving to: ARTIK 520 (Mouser). In some ways, this is even more powerful than the Edison anyway. The only thing is that I've been programming Intel chips in assembler for years - that's why I would've wanted to stay with the Edison.

The Edison is/was a brilliant product, dammit! That's probably what happens when a company's management gets taken over by accountants rather than engineers.
Post by cjm on Jun 30, 2017 17:33:05 GMT
Interesting to reflect on the following: hackaday.com/2017/06/19/intel-discontinues-joule-galileo-and-edison-product-lines/

Many years ago I tried my hand at assembly language to make an editor program run faster (on a Commodore 128 of all things). Fortunately the increase in processing speed of computers rendered further efforts on my part unnecessary. In due course my need for a special editor was overtaken by other interests. So my project was never completed.
Post by cjm on Jul 5, 2017 6:38:57 GMT
Post by Trog on Jul 5, 2017 8:12:57 GMT
Yes, the Raspberry Pi is nice for hobbyists and tinkerers. Many years ago, I would've liked to play with one as well. But for me it actually has too much. The Intel Edison was just a single module with WiFi and Bluetooth capability, and the breakout board merely exposed its pins. So one could use it to prototype systems eventually consisting of only the Edison on your own custom PCB, driving the intelligence and devices of your choosing. In that respect, I think the Samsung ARTIK 520 comes closer to the Edison than the Raspberry does. Besides, I find the Raspberry a bit Mickey Mouse-ish. (But if you're going to use a Raspberry to build a once-off system around, it could be a great solution.)
Post by cjm on Jul 7, 2017 6:26:47 GMT
Post by Trog on Jul 7, 2017 6:51:57 GMT
Interesting! I really do need to keep tabs on what's happening in South Africa as well. I've actually been inside that building, a few years ago. That was before it acquired the IoT appendage. At that stage, we used Vodacom to host our servers.