I've been building my own computers since 2001. My very first computer was an Epson Equity II+ (an AMD 80286 CPU running at 12 MHz). I've used AMD products in many computers, especially when the Athlons rocked the Pentium 4s in both speed and price. However, over the last several years my willingness to use AMD has waned. Specifically, I had an old Intel Core 2 Quad (Q9550) and built a new, shiny (at the time) AMD FX-8120 based computer. After some benchmarking, I found that my new FX-8120 machine wasn't really any faster than my Core 2 Quad, and on top of that it consumed more power than the Intel parts. I was sorely disappointed.
I recently picked up several newer Intel-based computers/motherboards/parts. I have a Chromebook with a Haswell-based CPU; it gets about 8 to 10 hours before recharging, and performance is surprisingly snappy for what it is. I have a Plex media server that runs an Intel Celeron J1900 (Bay Trail) processor. It is generally fast enough to transcode, yet it doesn't eat a lot of electricity. I also picked up a G3258 and an i7-4790K, built one new computer, and "upgraded" the other one. Same story: lots of processing power, low power consumption. The i7 with an Nvidia-based video card uses about half the power of my Q9550 with an AMD Radeon HD 7770 video card. Amazing.
I pay a bit of a premium for the Intel parts, but the performance is superior. I've also gone back to Nvidia after many years of using ATI/AMD Radeon parts. Why? Because more software supports Nvidia's CUDA cores than the AMD equivalent.
Sorry AMD... Maybe if you can build something that knocks the socks off my 4790K, I'll come back. Until then, see you later!