The fact that you can, with great effort, define in C++ a Vector type which checks array bounds, and can then, with more-than-usual effort, actually use it, is WORTHLESS: because just one co-author, just once, in your entire zillion-line 500-file program, uses the obvious default array mechanism, and game over. (JA has drunk the kool-aid to such an extent that he no longer even sees the stupidity, and actually thinks this is the way it ought to be.) News flash: safety should not be "achievable with effort." It should be the default. Unsafety should be what requires extra effort.

The fact that so many people, including JA himself, have been trying so hard for so long -- years and years of their lives expended -- to paper over such flaws in the C++ language is not a sign that it is a great language. It is a sign that it was done wrong. Let us admit the obvious. And what I find even more astounding is that he himself does not find this obvious.

And yes, a <---> operator for swapping. R.W. Floyd suggested this in his Turing lecture about how bad languages were. The language designers then proceeded to ignore him, as usual, until we have now reached the point where people like JA actually think it must have been a joke. But I am serious. Commonly used, important constructs should be compact. It is a very simple principle. Languages like PASCAL decided to write BEGIN and END everywhere that C just used { }. A very simple and obvious violation of a very simple principle. Why? For what possible reason? I mean, were they just sadists? Because I can see absolutely no other reason for it.
How can a language reasonably include "machine language stuff"??? A (the?) point of C was to be CPU-_independent_!
--Arggh. The point is: machines do things, and/or should. Some HLLs then try to prevent you from doing those things. This is stupid. The correct move: essentially every operation which any current or future machine does, or likely will or should, provide in hardware, should be accessible in the HLL. If your machine does not have it, then that is what HLLs are for -- to emulate it via compilation to a suitable code fragment! If they do not, then they are handicapping, not helping, you; and they are handicapping, not helping, future progress, because the hardware designers will then see no incentive to actually provide said hardware since "nobody uses it." Does any HLL take this correct attitude? Or does every single HLL take the wrong attitude? This is a huge disaster, and those who've drunk the kool-aid actually think this is the way it ought to be, they've become so wedged. It is so insane.

On the bright side, I totally agree with some of the stuff math-funners have said, like Rokicki re SIMD, parallelism, and memory/thread issues. Re some other stuff, like C++ templates: in fact my compile-time ideas would do that better. For example, suppose you wrote a SORT routine for int32's. You now want the same sort routine but for int64's, int8's, etc. OK, just do

  #for( T in {int8, int32, int64} ){ SORT routine defined here using type T }

and voila. Here #for is a for loop executed at compile time. You want a compile-time language very much like the actual language, albeit restricted in power (do not allow it to be Turing-undecidable whether a program compiles... incidentally, C++ fails that test and hence should have been dismissed right there: it actually is Turing-undecidable whether your C++ program is valid C++, which is a mind-boggling level of language-design incompetence and an incredible testament to how crufty it is).