If you make electronics, the odds of frying a component or shocking yourself are 100%, regardless of how educated or talented you are. The following happen to me on a regular basis, whether I'm working on high-speed digital/RF or making something as simple as an Arduino shield, even though I learned as an apprentice under a brilliant EE:
* PCB with poorly aligned copper layers shorting the second the prototype is plugged in, usually destroying at least one chip. Lots of smoke and burning FR4.
* Soldering in $100+ high-power transistors the wrong way. BOOM.
* Using a counterfeit capacitor from a shady vendor that either shorts internally or just plain explodes. Happens a lot when I need a really large capacitor and have to get it on short notice.
* Forgetting a little extra flux, so tin 'whiskers' form between freshly soldered pins and short them the second the device is powered up. This is so common that NASA has a whole website dedicated to the topic [1].
* Using the wrong temperature profile, or making the pad layout a thousandth of an inch too small or large, and bam: two solder balls on a BGA flow together, requiring hours of reflow and reballing if you're lucky, and a new $2,000 FPGA if you're not (rough numbers on that tolerance below).
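To put rough numbers on that last bullet: a layout error of a thousandth of an inch per pitch compounds across the package, so within a dozen or so balls a pad can end up closer to its neighbor's ball than to its own. A quick sketch, with the package dimensions invented for illustration (a generic 0.8 mm pitch part, not any specific FPGA):

    # Rough arithmetic only; the 0.8 mm pitch and 30-ball row are assumed
    # values for a generic BGA, not any specific part.
    MIL = 0.0254             # one thousandth of an inch, in mm
    PITCH = 0.8              # nominal ball pitch, mm
    BALLS_PER_ROW = 30       # roughly a 24 mm wide package

    for n in range(1, BALLS_PER_ROW):
        offset = n * MIL     # cumulative misalignment at the n-th ball
        if offset > PITCH / 2:
            print(f"ball {n}: off by {offset:.2f} mm, past half a pitch")
            break

At one mil of error per pitch, the outer balls drift past half a pitch around ball 16, which is exactly the "two balls flow together" failure above.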
Any nontrivial circuit is going to be impractical to simulate (and impossible to describe analytically as a whole) for all but the best-funded projects, so I'd say 90+% of EE is trial and error, even for the most experienced designers. There are many rules of thumb, and you develop an intuition for a wide variety of situations just as you do in programming, but it's a fundamentally different field with different constraints.
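To make the "impossible to describe analytically" point concrete: even a lone diode in series with a resistor has no closed-form operating point, because the loop equation is transcendental, and SPICE-style simulators handle this with Newton iteration at every node. A minimal sketch of that inner loop, with all component values assumed for the example:

    import math

    # Assumed example values: a 5 V source driving a diode through 1 kOhm.
    VS, R = 5.0, 1000.0      # source voltage (V) and series resistance (ohms)
    IS, VT = 1e-12, 0.025    # diode saturation current (A), thermal voltage (V)

    def f(vd):
        # KCL at the diode node: diode current minus resistor current
        return IS * (math.exp(vd / VT) - 1.0) - (VS - vd) / R

    def df(vd):
        # analytic derivative of f, needed for Newton's method
        return (IS / VT) * math.exp(vd / VT) + 1.0 / R

    vd = 0.6                 # initial guess near a silicon diode drop
    for _ in range(50):
        step = f(vd) / df(vd)
        vd -= step
        if abs(step) < 1e-12:
            break

    print(f"diode drop = {vd:.4f} V, current = {(VS - vd) / R * 1000:.3f} mA")

That converges in a handful of iterations for one diode; a real board has thousands of nonlinear devices coupled together, which is why full-board simulation gets impractical fast.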
Wow. That's some crazy stuff. So the scary trial-and-error is unavoidable, then. Thanks for the feedback. Btw, I just recommended a few books here based on feedback from other EEs...
Any thoughts on them? Particularly a combo of something like Malvino and The Circuit Designer's Companion to get a good head start on analog and PCBs respectively. Or do you have other references that kick ass at teaching practice more than theory? Gotta build up links for new people to accelerate the hands-on part of their learning, just like others did for programming.
Note: The Art of Electronics is usually on my list, but that link was for a digital learner. Not sure if it's needed there.
What sort of circuits do you want to design? RF? Audio? Analogue is a big field! It's like saying "I want to write programs, which books do you recommend?"
I'm gathering information to help everyone out, then organizing, cataloging, and sharing it. You could say it falls into some basic categories:
1. Enough knowledge to get designs working on an FPGA, plus integrate that with other chips on a PCB. OSS HW with minimal analog.
2. Enough knowledge to design basic analog circuits for control and stuff. Alternatively, to design digital cell libraries as there's almost nothing available for academic toolbuilders.
3. The serious, mixed-signal shit that lets me do some parts in digital and some parts in analog where it handles them better. I've seen analog coprocessors with 100x the performance at 1/8th the power on ODEs and such (see the sketch after this list). It also seems like certain signal-processing or media-codec tasks would be crazy fast/efficient in analog. I know high-end ASICs make extensive use of such techniques. What tidbits I see in blog comments and papers can only be described as black magic without a more thorough resource. :)
4. RF books beyond the ARRL ones that have been recommended to me. We need a lot of people experimenting with this stuff to reinvent things like TEMPEST that are classified, and they need some good resources to get a head start.
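On the analog-coprocessor point in item 3: a digital solver grinds through an ODE one timestep at a time, while the analog equivalent is a couple of op-amp integrators evolving in continuous time, which is where the quoted speed/power wins come from. A toy comparison, with the oscillator and step count invented for illustration:

    # Damped oscillator x'' = -2*zeta*w*x' - w*w*x, integrated by forward
    # Euler. The 1 Hz frequency, damping, and step size are assumed values.
    import math

    W, ZETA = 2.0 * math.pi, 0.1   # 1 Hz natural frequency, light damping
    DT, STEPS = 1e-4, 100_000      # 10 s of simulated time

    x, v = 1.0, 0.0                # initial displacement and velocity
    for _ in range(STEPS):         # a few multiply-adds per serial step
        a = -2.0 * ZETA * W * v - W * W * x   # acceleration from the ODE
        x += v * DT
        v += a * DT

    print(f"x(10 s) ~= {x:.6f} after {STEPS} serial steps")

The analog version of the same equation is two integrators and a few resistor ratios, with a "step count" of zero; that's the intuition behind those 100x-at-lower-power figures.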
So, those are some basic categories where I'm looking for both accessible, foundational material and cookbooks with heuristics. Being able to combine COTS components like MCUs and FPGAs on custom PCBs is a major help to hobbyists. Being able to make the cells and basic analog components required in about any ASIC, in conjunction with tools like the Qflow OSS synthesis flow, could get custom stuff going quicker. More thorough material on mixed-signal, both for its advantages and to explore the analog and digital interactions in digital systems that can screw either up. And RF for the reasons stated.
Whatever you have, drop it here or email it to the address in my profile. I'll keep circulating it along with others' tips and resources whenever people ask.
[1] http://nepp.nasa.gov/whisker/