sajis997 Posted July 19, 2016 Hello everyone, I think I had better tell a bit about myself and my expectations for this forum. I am interested in algorithm design and programming. I have several years of experience in C++, but I have yet to program any embedded systems. I am very keen to get into this paradigm and need your suggestions to guide the process. I have already seen several courses on this site. Could anyone suggest the order in which to take the "DIGILENT" coursework to become a professional embedded developer? Thanks
D@n Posted July 19, 2016 Welcome to the forum! You'll find many people here who love embedded programming. I am one of them. Embedded programming offers some unique, and yet very difficult, challenges over and above traditional programming. For example, in traditional programming, you can debug using printf. Sometimes you'll need a debugger. In embedded programming, printf is often hard to come by--debugging starts with the blinky LED. It's amazing how exciting an LED can be: now it turns on, now it turns off, now it blinks--that means it works! (You get the idea.) Unlike traditional programming, it is a lot harder to figure out where your code is failing in embedded programming--even the debugger can be a challenge to get running. (Did the LED turn on? It might be all you've got!) If you wish to get your feet wet and learn about embedded programming, then start with the "Learn" links above. There are a lot of projects there. Plan on learning how to use every peripheral connected to ... whatever board you choose to start with. If your goal is instead to become a professional embedded developer, then you may need to do more. In that case, I would recommend you look up Bruce Land's lectures on youtube. Just a personal opinion ... (I'm still going through his lectures on the PIC32.) Also: plan on making many mistakes. Yours, Dan
sajis997 Posted July 19, 2016 Author Thanks Dan, I have already piled up some resources, and I would like to discuss them here to make sure I am picking the correct reading material, along with the suggestions you mentioned. The very first on my list is: 1. An Embedded Software Primer - Some comments about the book would be helpful. Regards sajis997
D@n Posted July 19, 2016 Gosh, Sajis, I'm kind of the wrong person to ask about that. I learned embedded programming by working with a master at it. Yeah, sure, I took a course or two in college, but any textbook from then is way out of date today. I would offer two pieces of advice, though: If you get a textbook, make sure that either a) the textbook is hardware independent (and that is what you want), or b) the hardware the textbook teaches from is the hardware you have--or at least hardware you are intending to get. Every embedded platform is different. It'd be a shame to have a book teaching you about the peculiarities of a piece of hardware you do not have. The second piece has nothing to do with the textbook, but was a valuable lesson my own mentor taught me: Do what you can to get your algorithm running on a non-embedded platform before trying to run it on an embedded one. This comes from my first embedded project out of college, where we had multiple computers hooked up with data pipes between them. We built a "simulation" of the hardware, using TCP/IP socket pipes instead of the hardware data pipes, simulated (perhaps even recorded?) data, and then proved that our software and our algorithms worked before we tried placing them onto an embedded machine. I am now a firm believer that this is just good programming practice--although I will permit others to disagree. Hope that helps and gets you closer, Dan
hexafraction Posted July 20, 2016 17 hours ago, D@n said: The second piece has nothing to do with the textbook, but was a valuable lesson my own mentor taught me: Do what you can to get your algorithm running on a non-embedded platform before trying to run it on an embedded one. This comes from my first embedded project out of college, where we had multiple computers hooked up with data pipes between them. We built a "simulation" of the hardware, using TCP/IP socket pipes instead of the hardware data pipes, simulated (perhaps even recorded?) data, and then proved that our software and our algorithms worked before we tried placing them onto an embedded machine. I am now a firm believer that this is just good programming practice--although I will permit others to disagree. Hmm... I've always done hardware simulations using a Verilog simulator (Modelsim PE student edition), with multiple devices just being instantiated in the testbench and hooked together (with additional processing to introduce sources of noise/jitter/error the hardware needs to be able to handle). What advantages did you see with the socket pipes that I may not know about? Did it offer additional debuggability through tools such as Wireshark?
D@n Posted July 20, 2016 @hexafraction: Good question. First, understand that I was answering a question about embedded programming. To me, that means C/C++, and not so much VHDL/Verilog. (We can argue that difference later if you would like ... although it's probably little more than chosen terminology.) To the extent that you can write your software in generic C/C++, you shouldn't need a "simulator" per se to run it. In the discussion above, the team I was on used the sockets/pipes to connect multiple embedded programming (C/C++) "chips" together. This allowed us to test all of our software in a hardware independent fashion--using computers that made it easy to test. When we were ready to place the software onto the hardware, we then knew that all of our algorithms worked. This limited potential problems to hardware specific features, such as the actual communications pipes and such. Our simulation remained valuable, though, because we tended to have only a couple of hardware boards and the simulation would run on anyone's desktop computer--regardless of whether they had access to an actual hardware board. I hope I didn't mislead you with my prior answer--there was no Verilog simulator involved in that project. Indeed, there was no Verilog or VHDL involved either--at least not on the part of the project I was working on. More recently I've been focused on Verilog, and specifically Verilog within Digilent products. Verilator has been my chosen test bench. I've also made it a personal goal to have all of the hardware interfaces working in a Verilator simulation model before I purchase the hardware--perhaps as a piece of this same lesson learned. This means building simulations for all of the peripheral interfaces and more before purchasing the hardware. As one example, I built a UART simulator that interprets UART logic and converts it to a TCP pipe, and then does the same for the other direction--converting TCP pipe inputs to UART logic levels.
As another example, I was working on a Flash simulator the other day that would run within Verilator. In general, this personal goal has helped to keep the pain in my wallet down ... as an example, I was at one time going to purchase the Spartan 3E development board; I built drivers for all its peripherals, and then turned around and purchased a Basys-3 instead. Still, I had all the peripherals running before the board arrived--I had saved myself the cost of a board that wouldn't have met my purpose. Does that clear my answer up any? Dan
hexafraction Posted July 20, 2016 5 hours ago, D@n said: @hexafraction: Good question. First, understand that I was answering a question about embedded programming. To me, that means C/C++, and not so much VHDL/Verilog. (We can argue that difference later if you would like ... although it's probably little more than chosen terminology.) To the extent that you can write your software in generic C/C++, you shouldn't need a "simulator" per se to run it. In the discussion above, the team I was on used the sockets/pipes to connect multiple embedded programming (C/C++) "chips" together. This allowed us to test all of our software in a hardware independent fashion--using computers that made it easy to test. When we were ready to place the software onto the hardware, we then knew that all of our algorithms worked. This limited potential problems to hardware specific features, such as the actual communications pipes and such. Our simulation remained valuable, though, because we tended to have only a couple of hardware boards and the simulation would run on anyone's desktop computer--regardless of whether they had access to an actual hardware board. I hope I didn't mislead you with my prior answer--there was no Verilog simulator involved in that project. Indeed, there was no Verilog or VHDL involved either--at least not on the part of the project I was working on. More recently I've been focused on Verilog, and specifically Verilog within Digilent products. Verilator has been my chosen test bench. I've also made it a personal goal to have all of the hardware interfaces working in a Verilator simulation model before I purchase the hardware--perhaps as a piece of this same lesson learned. This means building simulations for all of the peripheral interfaces and more before purchasing the hardware.
As one example, I built a UART simulator that interprets UART logic and converts it to a TCP pipe, and then does the same for the other direction--converting TCP pipe inputs to UART logic levels. As another example, I was working on a Flash simulator that would run within Verilator the other day. In general, this personal goal has helped to keep the pain in my wallet down ... as an example, I was at one time going to purchase the Spartan 3E development board, built drivers for all the peripherals, and then turned around and purchased a Basys-3 instead. Still, I had all the peripherals running before the board arrived--I just saved myself the dollars of not buying a board that wouldn't have met my purpose. Does that clear my answer up any? Dan Yes, that definitely cleared things up. I think I didn't read thoroughly enough and assumed that HDL-based hardware design was involved. Thanks again!
This topic is now archived and is closed to further replies.