I was once debugging an issue with my laptop's multimedia button driver. I found a tweak that helped: enable the tweak, reboot - works fine; disable the tweak, reboot - no go. I tested this several times.
I filed a bug report and received an answer that my tweak couldn't possibly work. The tweak was disabled at the time, and the button didn't work. I pressed it again after reading that email and boom - it worked.
I swear.
I figure my mind must have believed so strongly in the futility of pressing this button that it didn't even bother pushing it all the way down. Scary stuff.
Troll. But just in case you're not permanently damaged yet:
Imagine you're invited to a party where you are the oldest, fattest, least appealing person there, and you're not particularly made to feel welcome.
Now imagine you're expected to attend dozens of these parties in the course of your employment.
Now imagine that these parties are intentionally arranged this way, and that the pretty people are paid to be there for the express purpose of making you look and feel old, fat, shabby, and unwelcome.
You're invited to a conference that's relevant to your position and your industry.
To a party relevant to your industry.
When you arrive, a number of hired twentysomething men are milling about wearing tight tee-shirts, padded crotches, and skintight jeans. They're dancing, drinking, flirting with you and your coworkers, and subtly drawing attention towards their ample bulges.
I'm pretty sure the dancers weren't even allowed to drink, and they didn't flirt with the female participants, if they flirted with anybody at all.
Some people in the crowd are really getting into this, having their pictures taken with them. The room starts getting sweaty, the music amps up, and the paid gogo boys start thrusting their cocks in rhythm with the music.
And I guess at this point I'm supposed to imagine getting drunk, gang raped by gays and whatnot.
This "analogy" was a severe exaggeration of what really happened.
This is why I love this particular religious war so much. No matter how absurd it gets, someone can always come along and post something even dumber and more deliberately inflammatory, and without any self-awareness at all.
I have no problem with people using 4 spaces for indentation.
However, damned shall be those who think that tabs are anything but 8 spaces wide and actually write code under that assumption. Tabs really are 8 spaces wide, and every terminal will tell you so.
Real-world example: just yesterday I was bitten by Eclipse defaulting to "4-space-wide tabs" for indentation in an otherwise space-indented codebase. Everything looked nice and cool until I ran git diff. How could the Eclipse developers think that was a good idea?
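To make the failure mode concrete, here's a contrived C demo (mine, not anything from the actual codebase). It prints the same two "lines of code", one indented with a tab and one with four spaces; they only look aligned when tabs are rendered 4 columns wide, and your terminal, cat, less, and git diff all render them 8 wide:

    #include <stdio.h>

    int main(void)
    {
        printf("\tint a = 1;\n");   /* tab-indented line */
        printf("    int b = 2;\n"); /* space-indented line */
        return 0;
    }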
Eclipse is very goofy about this, and it's something I loathed about using it (I recently switched to IntelliJ, which is better, but not by an order of magnitude). In my experience, it will sometimes indent according to your settings, spaces or tabs, but at other times, based on some other setting that lives in a different place than the first, it will indent based on the line previous to the one you are indenting. This results in a slow creep of badly formatted code entering the codebase if you have a lot of developers who don't care, or don't know to check whitespace on commit, and who don't explicitly share a well-customized settings file.
At least IntelliJ has an Emacs-style DWIM tab, but its editor, while better than Eclipse's, is still subpar and has a whole host of problems.
Now, some people will claim that having 8-character indentations makes the code move too far to the right, and makes it hard to read on a 80-character terminal screen. The answer to that is that if you need more than 3 levels of indentation, you're screwed anyway, and should fix your program.
I do agree that there comes a point where you can get overzealous with indentation, but 3 levels deep is pretty common. In fact, you start at one level of indentation just by having your code inside a function. So all it takes is one conditional inside a loop and you're already maxed out.
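For scale, a throwaway sketch: one function, one loop, one conditional, and you're already at the three-level ceiling (shown here with the 8-character indents being defended):

    int count_positive(const int *v, int n)
    {
            int count = 0;

            for (int i = 0; i < n; i++) {
                    if (v[i] > 0)
                            count++;        /* third level already */
            }
            return count;
    }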
AMD AM2/AM3 CPUs support ECC. It will work provided that the motherboard has the extra wiring required and BIOS support is enabled.
Some vendors (notably ASUS) advertise ECC support on many cheap AMD motherboards.
AFAIK, to get ECC with Intel you have to pay for a Xeon.
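If you want to verify that ECC is actually live (wiring present, BIOS support enabled), the Linux kernel's EDAC subsystem exposes per-memory-controller error counters in sysfs. A minimal sketch (mine, not from this thread), assuming the standard /sys/devices/system/edac/mc layout:

    #include <stdio.h>

    int main(void)
    {
        char path[64];
        long ce;
        int mc;

        for (mc = 0; ; mc++) {
            /* ce_count = corrected (single-bit) errors caught by ECC */
            snprintf(path, sizeof path,
                     "/sys/devices/system/edac/mc/mc%d/ce_count", mc);
            FILE *f = fopen(path, "r");
            if (!f) {
                if (mc == 0)
                    puts("no EDAC controllers found: ECC likely not active");
                break;
            }
            if (fscanf(f, "%ld", &ce) == 1)
                printf("mc%d: %ld corrected errors\n", mc, ce);
            fclose(f);
        }
        return 0;
    }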
Without the common mathematical language necessary to reason and talk about computation in the general case, Babbage-lineage programmers would have been at a severe disadvantage.
Not sure about that.
The Analytical Engine was capable of arithmetic and conditional branching. Code and input data were to be read from punched cards.
Such a machine would have been quite practical, and if Babbage had managed to build it, it would immediately have been put to use for number crunching - scientists and businesses would have loved a calculator that could autonomously eat streams of data and perform boring computations on them.
Babbage and Lovelace were aware that the AE could be used to process numerically encoded non-numeric data, so it would also have found uses in text processing. Sooner or later somebody would have invented Fortran and COBOL (after all, early compilers were pretty much string rewriters) and we would have been headed to the same place we actually ended up.
Only CS PhDs would waste their time on attempts to algorithmically solve the halting problem instead of "wasting" it on the P=NP debate.
The machine would have been capable (it would have been Turing-complete, after all) and they had some ideas of what it could be used for, but I think you are forgetting that you are standing on the shoulders of giants. Those hypothetical programmers would have lacked any common mathematical language with which to talk about computation, and thus lacked an abstract model of computation entirely. They would have been able to program particular machines to some degree, but unless someone was spurred on to replicate the work of Church/Turing/etc. decades earlier than it actually happened (in fact, their work would undoubtedly have been pre-performed out of necessity), they would lack a solid abstract model of what computation is and what is necessary for it. Until that theoretical work was done, all of those programmers would just be taking wild stabs in the dark with ad hoc methods and superstition-spawned intuition. Engineers without Newtonian physics, with no common language beyond punched cards and plain English: they could build great bridges, sure, but lacking any semblance of a formal notion of computation, they would be crippled compared to what they could be. They would be at a distinct disadvantage.
Hell, half the reason they couldn't build the thing is that they lacked the theoretical background that would have put it within their grasp. With no switching theory, without the insight of Shannon, the machine had to be an unwieldy system of gears. It is hard for the modern mind to fully internalize just how much framework they were lacking.
The extent to which the standard HN "formal CS educations are worthless" battle cry is true is only the extent to which we benefit from those who came before us.
The UTM is a dumb and completely impractical model of computation. Thanks to its simplicity, it is a handy tool in (un)decidability proofs and in some very general computational-complexity arguments.
And that's all. Numeric computations happened on real machines. Graph processing happened on real machines. Text processing happened on real machines. Structured programming, procedures, programming languages, compilers - all happened on real computers.
Using (or even thinking about using) the UTM for any practical application would be a huge PITA and nobody ever does it. We only know that it's "possible" and hence if we want to prove something general about all possible computation, it suffices to do some magic with the UTM.
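To make that concrete, here's a toy single-purpose "Turing machine" in C (my own sketch, nobody's real code) that does nothing but add 1 to a binary number by scanning left and flipping bits. Even this trivial arithmetic needs an explicit tape, head position, and transition rules; now imagine doing anything practical this way:

    #include <stdio.h>
    #include <string.h>

    int main(void)
    {
        /* Tape: a binary number, MSB first, with one blank cell
         * of headroom on the left in case the carry ripples out. */
        char tape[] = "_1011";
        int head = (int)strlen(tape) - 1;  /* start at the last bit */

        /* Single state "scan-left": turn trailing 1s into 0s until
         * hitting a 0 or the blank, write a 1 there, then halt. */
        for (;;) {
            if (tape[head] == '1') {
                tape[head] = '0';
                head--;                    /* move left, same state */
            } else {
                tape[head] = '1';          /* absorb the carry */
                break;                     /* halt */
            }
        }

        printf("%s\n", tape);              /* _1100, i.e. 1011 + 1 */
        return 0;
    }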
I don't think you are getting it. In Babbage's time they were not merely missing the UTM. They were missing damn near all of mathematical logic (even set theory only really came about around the time that Babbage died) and meta-mathematics was incredibly immature at the time. Forget answering questions about CS; they could not yet ask the questions.
They could have pulled some impressive stuff off I am sure, but it would have all been intuition and stabbing in the dark. The tools necessary to structurally reason about algorithms had not yet been created, nor even the tools necessary to create those tools...
Ada had the notion that the Analytical Engine was something special, something more than just a calculator... but that was conjecture based on genius insight. To actually discuss that idea in a rigorous manner would require several more decades of advances in mathematics.
Could they have programmed? Yes, obviously. That's not even hypothetical, since they did. Would they have been at a distinct disadvantage? Without anything reasonably resembling modern mathematics, absolutely.
You don't need logic, set theory, Turing machines or any meta-mathematics to build practical software.
When I was ten, I had no idea about any of this stuff, and yet when somebody showed me how to do arithmetic, variable assignments, comparisons, and goto in QBasic (pretty much the equivalent of Babbage's machine), I was able to write a simple drawing program and a tic-tac-toe that checked whether one of the players had won.
Add some IO and I would have written a program that reads a series of transactions and computes your bank account balance. Tell me what a matrix is and I would have implemented LAPACK for you.
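Roughly what I mean, sketched in C but deliberately restricted to the same primitives - arithmetic, assignment, comparison, goto, plus the IO - reading one signed transaction per line until EOF:

    #include <stdio.h>

    int main(void)
    {
        double balance = 0.0;
        double tx;

    read_next:
        if (scanf("%lf", &tx) != 1)
            goto done;                 /* EOF or garbage: stop */
        balance = balance + tx;
        goto read_next;

    done:
        printf("balance: %.2f\n", balance);
        return 0;
    }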
You absolutely need those things for a great deal of modern programming. All modern programming? No, but a great deal.
Without that they would have been at a disadvantage. I don't see what is so hard about this concept to you.
Regardless, the simple historic fact remains that Babbage and Ada were both unable to build the machine, and unable to verify their suspicion that the machine was special, and unable to effectively communicate to their peers this suspicion. The prerequisite math for all three of these tasks did not yet exist.