Obligatory Bret Victor:
TL;DR: "horse manure"
I almost forgot this. I was very excited to watch it, but came away a little disappointed.
I really liked the way he said "we don't know what we are doing," and so on, because that is particularly true. Even after you have had a brilliant idea, you won't know it until it has been tested. So you might say that even when you know, you don't know. In addition, the lies that ride "our" industry are so hard to break that even universities fail to expose them. In fact, no one is more of a victim to them than the universities themselves. Which is kind of sad.
We are aware of only a few percent of the code we produce. The rest is a black hole of zeros and ones, ripe for anyone with a little curiosity to find a place to stick a hook in and break it. Which they do, constantly. This is physics, in fact. Yet you can't sell that.
I think the talk failed in some very important areas. Moore's "law," for instance, was mentioned. It is by now dead, and has been for at least 10 years. By now it is more myth than reality, and that is why we see parallel computing trying to make up for it.
(I realize Moore's law isn't exactly about performance, but I also tend to think that to the extent it is not, it is largely irrelevant, at least by now.)
But parallel computing does not scale the way he claims. It fails to scale for the same reason that adding more coders at the end of a project doesn't scale. For certain tasks, yes. But for an entire program? No. For the future? A little bit, yes. A lot? No.
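This intuition is what Amdahl's law formalizes (my framing; the law isn't named in the text above): if only a fraction p of a program can run in parallel, n cores can never give more than 1 / ((1 - p) + p/n) speedup, no matter how large n gets. A minimal sketch:

```python
def amdahl_speedup(p: float, n: int) -> float:
    """Upper bound on speedup with n cores when a fraction p of the work is parallelizable."""
    return 1.0 / ((1.0 - p) + p / n)

# Even with 90% of a program parallelized, 16 cores give only ~6.4x,
# and no number of cores can ever push it past 10x (= 1 / (1 - 0.9)).
print(round(amdahl_speedup(0.9, 16), 1))  # 6.4
```

The serial fraction, like the coordination overhead of extra coders on a late project, is what caps the whole thing.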
The cores we run now are designed to be mostly idle, too. And if they aren't kept idle, they will overheat and simply stop working well. In other words, they are overclocked chips that must be cooled practically constantly, or they will burn. The only new thing about them is that they internally approach the physical limits of their size. But the bang you get back isn't all that great.
Now, just think back 15 years. What CPU did you run then? Was it a 500 MHz machine? 15 doublings (today they say the doubling period is 11 months) would give you 16,384,000 MHz, or 16 THz. And if we assume only 8 doublings, the number would be 128,000 MHz, or 128 GHz. (Well, not quite, but you get my drift.)
Today, each core should be running at 128 GHz if Moore's law included performance. In the next 20 years it will not get any better unless there is a significant breakthrough, for instance in quantum computers. But it could also come from somewhere else, or as a byproduct of trying to make QC work, or something else. Who knows.
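The back-of-the-envelope arithmetic above, spelled out (treating clock speed as a stand-in for performance, as the text does):

```python
base_mhz = 500  # a typical desktop CPU of roughly 15 years earlier

# If performance had kept doubling once a year, 15 doublings would mean:
print(base_mhz * 2**15)  # 16384000 MHz, i.e. about 16.4 THz

# Even with a much more modest 8 doublings over those 15 years:
print(base_mhz * 2**8)   # 128000 MHz, i.e. 128 GHz
```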
The only thing I know is that Michio Kaku laughs at Google spending hundreds of millions on QC research, or more specifically, at the current "results." And he is not the only one. Just the fact that he does, and that other scholars do as well, should tell you something. But I still believe more in people who are actually doing the experiments than in academia. So here my money is in fact on Google. But not a lot of money.
But it is for SURE not something you can count on. Same as you can't count on oil lasting forever.
But if there is a breakthrough, it will not come from parallel computing by itself. Parallel computing will maybe someday become part of the future, but it will not happen on silicon, using these kinds of chips. I mean, it already happened, but you need to understand that what we got is a joke compared to what we need. We need parallel computing (we always will), but on MUCH faster cores.
What you really have in your PC today is an overclocked CPU, where the "technology" is the same as it was 15 years ago; we have only become better at the manufacturing process. Better at tweaking. That will give you at most 4.4 GHz... for a couple of hours a day, if you're lucky. So don't talk to me about "Moore's law," because it's been dead since about the time of Jurassic Park. Yes, I know about pipelines, caches and so on. They just don't make up for it, not by a long shot! They too are just ways of tweaking the technology into performing a little more. For cache it's a LOT more, and if you disable it, those modern CPUs become as slow as 10-15 years ago. Not that the caches and the pipelines aren't cute and smart. They certainly are. But compared to a *working* Moore's law, also in the realm of performance, they are peanuts.
Most of the last decade's progress seems to have happened in GPUs, but these are very costly beasts, and as I said somewhere else, they cost about the same today as 4 years ago. In the past 4 years they became at most 4 times more capable, but you now have to pay 4 times as much for it. They should be 16 times more capable and cost about the same. And even if my estimates are somewhat wrong, they are not significantly wrong. And I am speaking of the range of GPUs a normal person can expect to buy. If you are ready to fork out 50K, or 200 million, then be my guest. But this pricing says more about the tech than any paper could.
Another thing I reacted to is his point regarding manipulating memory directly. This is a particularly vague point, when we know that the CPU is mostly idle while doing this, and that the bottleneck is nowhere near the CPU, but the memory chips and data lines themselves.
Another reason this point is very weak is that we are, of course, already doing this. We are writing the data directly. And it is dog slow.
And when it comes to his talk about aliens, and computers negotiating a protocol: well, that is what the web is doing, and we all know how amazingly, mind-blowingly fast that is...
I want to add a few things that may strike you as odd, but that I think are extremely important to grasp. It is as important as understanding the process of evolution, and how everything that happens depends on that process, and on a subset of it, even our "own" research.
Intelligence is, more often than not, what we call a long string of trivial (even dumb) steps and data. And as I said above, even when you have managed to produce something intelligent, it often needs to be rigorously tested before you can be sure it deserves the name. Which basically means that intelligence is an oxymoron. Which in turn means that AGI is an oxymoron.
Now. What intelligence always consists of is very specific facts. Not only in space, but also in time. Always details. Details, details, details. Details that need to correlate with other details. Overwhelmingly many for such a poor brain as nature has given us. We haven't even got the intelligence to calculate the curve of a ball in real time, but have to go to school to learn a simplification of it, created by a famous monkey a long time ago. While breaking a sweat. Well, some of us are.
Yet our subconscious makes a pretty decent guess after a little training, does it not? To your subconscious it's like nothing. We have a genius inside. But we are in charge of giving it the right training, and access to details. Something to chew on for the next 20-30 years or so.
And if we do that, once it has worked a little, it can pour designs out on "paper" faster than you can think. Like a boss! No, not a boss, gawd no. I mean something else.
And you, the so-called "conscious" idiot, may then need months, or even years, to completely understand your own designs. That's the difference, and the potential you are missing out on, if you don't invest time working with details.
So it is no mystery to me why all of my browsers fell to their knees in front of me when I loaded a significantly large text file (40 megabytes) into them some months back, while my own text editor loads and indexes it faster than you can say boo. That is because I have specifically written it to do so quickly: I found the reason it was slow, and then of course solved for it. And in the process I learned a new technique for speeding up similar problems later. Even though I am an OOP programmer, I would never create an object for every character of a document, would I? Unless I didn't know the first thing about what I was trying to do.
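As an illustration only (I don't have the editor's actual code, and it is presumably not written in Python), the trick of indexing a large file without allocating an object per character can be sketched like this: keep the whole file as one flat buffer and record only the newline offsets.

```python
def index_lines(buf: bytes) -> list[int]:
    """Return the byte offset of the start of each line in buf.

    The file stays in memory as a single flat buffer; a "line" is just
    a (start, end) pair of integers into it. No per-character, and no
    per-line, objects are created until a line is actually needed.
    """
    offsets = [0]
    start = 0
    while True:
        nl = buf.find(b"\n", start)
        if nl == -1:
            break
        offsets.append(nl + 1)
        start = nl + 1
    return offsets

buf = b"first\nsecond\nthird"
offs = index_lines(buf)
# Line 1 ("second") is the slice between two recorded offsets:
print(buf[offs[1]:offs[2] - 1].decode())  # second
```

One integer list for a 40 MB file is a few megabytes at most; forty million character objects would be orders of magnitude more, plus the allocation time.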
If you never program at the low level, you may never discover things like that. And by "low level, ever," I mean spending considerable time with it. So much that it becomes second nature, your preferred "language." And believe me, I have nothing against high level. I always program my apps at a high level, and I would much rather say "LoadBitmap" or even "LoadPicture" than have to do the low-level work, if it can be avoided. Because I want things done.
But if they are slow, then they need to be rewritten. And for that, assembly is the best way, because it shows you the way in which to do it. And it is much, much simpler too, than doing it in C. MUCH simpler. Like 100 times simpler. You cannot believe this unless you have done it exactly that much. And this knowledge is practically ignored today, which is very sad. The lie is that it takes too much time, or is more error prone. The reverse becomes true when you actually work at that level, often. Every day for 10 years. There are a few counterarguments that are valid, but they are very, very few, and those "problems" are problems with the tools used for writing it, not problems of assembly itself.
Today's biggest problem in computing is that we require development to go fast, and are then unable to produce more than one or two applications per decade that don't actually have tons of bugs and irritating peculiarities. We are producing massive amounts of completely ridiculous, slow and useless software, to "solve" problems that a ton of other, similar software has already been written to solve, software which is ALSO full of bugs, 100% worthless and 100% useless.
We should slow down and take the time to make software that runs fast and is robust, instead of making development fast at producing a lot of irrelevant shit. I see that time coming now, as the hardware evolution slows down.
Most programs today are just fads. They exist for a brief moment in time, and everybody just accepts that they are shit waiting to happen, because no one wants to learn how to do it right. Don't even try to tell me otherwise. Until software comes with a written guarantee for how it is supposed to work, it will continue to be largely worthless.
What we need is NOT a new Microsoft certification for how to get a Windows-ready bumper sticker on your plastic. We need the industry to require guarantees for how software is supposed to work, and for vendors to pay for the damages they cause, before they are allowed to take money for it.
Only with such requirements will we ever be able to produce scientifically sound software. And that would help all of us. Software would become worth something again. It would also force software writers to write functional software, and not just another "me too, see how cool I am in your taskbar, even if you now hate me for it" kind of software. And the OS must of course be the very first software to come with these guarantees. This would make software as real and worthy as it should have been in the first place.
Such requirements would focus development on solving the problems it is supposed to solve, and not 1000 problems no one ever had. Yes, Visual Studio, I am looking at you. I am amazed and surprised you managed to restrain yourself from putting a talking animated wizard in there, just to completely fuck up my day.
By imposing a requirement to put guarantees on software you want to be paid for, you could also relax a little of the protection the OS imposes on software, which severely limits performance and, in particular, development time.
But as this man says in that video, it takes some time to recalibrate for another kind of thinking. So in that sense, he has an important point.
In addition: I could be wrong, but I am not sure we need to teach our kids how to program. I think real talent will transcend whatever needs to be transcended in order to learn anyway. That is what we see Casey and others do. In fact, "the less we teach them, the better" is one of the thoughts I currently hold. What we should do instead is stand back and give them the chance to learn on their own. And the TIME to learn. Teach them, if needed, to learn by themselves. And not put obstructions in their way.
This has been demonstrated, too, by real experiments in the jungles of India, where children have taught themselves up to university level, in quantum physics, IIRC, with nothing more than a computer and the internet. At the age of like 10 or 12 years old or something.
And we see countless examples of that. Kids who in 5 seconds completely break the false "security" some wise-ass university professor spent his career researching. It's so beautiful to see stories like that. People think they are intelligent, and then they can't see things like that coming from a mile away.
There are a lot of radical approaches, but most of them are not as new as you might think.
This seems true to me. But perhaps fewer people work on those things, or they are less visible.
I wish we would teach children to doubt more. To question more. Question everything! I wish we would teach children not to be afraid of making mistakes. The more mistakes they make, the better. Some mistakes are of course lethal and should be avoided, for the longest time ;-) But when it comes to learning things like science, and the unknown, it's the only way to go.
The fact is that we are very clever at hiding this: that many of the things we discover are found through stupid hacks and pure luck. In unawareness. And that we then spend the aftermath trying to understand them. That's how physics is done too. We think of those people as geniuses, but most of them are stupid as fuck, just like everybody else. They see some experiment, and try to explain it. It is not, and has NEVER been, the other way around, where you first learn how to do it, and then you do it.
We tell each other a lie. We hide our mistakes. We pretend we are so advanced now. It is a very comfortable lie.
For well-established facts, learning them, or at least having access to them, is good, of course. But for discovering new science, especially in computers, the fastest way to go is by making mistakes. It is good that your program crashes, if it means you made an error. The sooner the better.
That's also, I bet, the reason why quantum theory holds, yet no one understands it: because even the experts are dumb as sauerkraut. And I don't really mean that as a joke. It is literally true. And a wise man knows it about himself, too.
For a computer science student to be afraid of pointers and program crashes is pathetic. It's like a chemistry student being afraid of H2O! It's fucking ridiculous!
But instead of embracing knowledge, we accept: SLOW software. 10,000 known and 1,000,000 unknown security breaches. Constant updates, insane restrictions, years wasted reading retarded information on how to do the simplest of things. And so on.
Other than that, it is a toy.
My phone, which cost like $1K, is amusing to observe as it tries to take a picture before the photo light goes out, and fails every time! It is the same with the autozoom. It zooms beautifully, and then you click to take the picture, and it misses the zoom, LOL. It happens every single time. Maybe I am using it wrong, haha, but I know it's not just my phone, because I traded it in for the same model, and it is exactly the same. It's very fun to watch. I hope that isn't the rule with these things, but I'd rather not pay another $1K to find out.
Believe me or not, but the PC era is far from over. The PC is here to stay for a *considerable* time. A few days ago I needed a flashlight for reassembling my mainboard after overhauling it, and I used my phone as a flashlight.
The battery was full when I started, but it died within like an hour. For 100 other reasons, it's a very, very nice-looking, but utterly pathetic "computer."
So when you read that Google is going to replace all of the world's workers, what you should read is "horse manure." In 5 years, you're going to need to hire a person just for changing your batteries.