Handmade Hero » Forums » Code
About the Twitter/VS rant

Arnon
notnullnotvoid
Sounds like you're saying it has a lot to do with developer competency...


Competence always varies within any large organisation. That's just natural - anywhere.
I'm saying it has to do with proper leadership (call it "adult supervision" if you like) making decisions that are "not politically correct", or rather motivated by a philosophy that is counter to the PC culture.

Even if a leader thinks this way, they may not be able to act on it properly if the culture around them would not accept it. So we're back to culture again. PC culture has been increasingly invading the corporate world of late.

Arnon
AsafG
There's a difference between individual competence and cultural competence.
...
If you have a competent individual that is part of a competent culture, but is working at a company that is largely made up of programmers who are part of an incompetent culture (regardless of the programmers' individual competence), they would be pretty miserable. It would be hard to do good work, and there would be a lot of pressure to join the incompetent culture.


Totally agree with everything you said. I had this exact experience in my last job.
Sounds like we're in agreement :)

I'll add to that:
Incompetent cultures (as you put it) can still care about the competency of their constituents - filtering out juniors, pushing up people's competence, putting pressure on them to improve, etc. - and yet, at the same time, still be incompetent in other respects. A culture can simply value the wrong things in its hierarchy of values. It can just be a value-system disorder.
I think that is much more common than people realise.

AsafG
These are the people who can help you get the VS startup time down to "under 10 seconds"


That would be very unlikely, because, again: loading time is not a constrained, localised thing you can just optimize using standard optimisation practices. It tends to involve initialisation of MANY sub-systems within the software stack, each of which would be "owned" by a different team in the organisation.
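
To make that concrete, here's a hypothetical sketch (the subsystem names are made up, and this is obviously not how VS is actually structured): a big application's startup path tends to look something like this, where every call belongs to a different team and the total belongs to nobody.

```c
// Hypothetical sketch - subsystem names are invented, nothing from any real codebase.
// The total startup time is the sum of many initialisers, each "owned" by a different team.
#include <stdio.h>
#include <time.h>

// Coarse, CPU-time-based timer for the sketch; a real tool would use a wall-clock timer.
static double Seconds(void) { return (double)clock() / CLOCKS_PER_SEC; }

// Stand-ins for real subsystems, one per team:
static void InitTelemetry(void)     { /* owned by team A */ }
static void InitExtensionHost(void) { /* owned by team B */ }
static void InitProjectSystem(void) { /* owned by team C */ }
static void InitEditorUI(void)      { /* owned by team D */ }

typedef void (*InitFn)(void);
static const struct { const char *Name; InitFn Init; } Subsystems[] = {
    {"Telemetry",     InitTelemetry},
    {"ExtensionHost", InitExtensionHost},
    {"ProjectSystem", InitProjectSystem},
    {"EditorUI",      InitEditorUI},
};

int main(void)
{
    double Start = Seconds();
    for (int i = 0; i < (int)(sizeof(Subsystems) / sizeof(Subsystems[0])); ++i) {
        double T0 = Seconds();
        Subsystems[i].Init();
        printf("%-14s %.3fs\n", Subsystems[i].Name, Seconds() - T0);
    }
    printf("Total startup: %.3fs\n", Seconds() - Start);
    return 0;
}
```

Timing it is the easy part; getting every one of those teams to give up their share of the budget is the political part.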

So, optimizing loading time is WAY more "political" than "technical"....

Arnon
It's very convenient and familiar to think in terms of personal accountability.
It's convenient because we get to point fingers at individuals as the cause of the problem, and that makes the problem seem tractable. It's also satisfying...
It's familiar because that's how our western culture trains us to think all throughout our lives.

It's much less familiar and convenient to think in terms of systems and structural problems.
It's inconvenient because you suddenly don't get to point at individuals anymore, and that makes the problem seem less tractable. It's also less satisfying...
It's unfamiliar because we're typically never trained to think like that.

But the truth of the matter is that, especially as societies and cultures grow in size, more and more of the problems that manifest are structural and systemic.
They are natural byproducts of the ways in which we've set up the social game itself.

VROOM
It feels to me that even if you accept all the arguments for why inflating an organisation's number of participants brings about inevitable decay (sort of like the Tower of Babel), and it's not the individual's fault, Casey still has a point about them (Microsoft) not caring.

Microsoft has made a few versions of its operating system that have run on whatever they needed them to: as a DOS extension, an OS for IBM PCs, media players, PDAs, and the Wikipedia page even mentions "satellite navigation systems", if you believe it.

I expect making the kernel run reliably (and perform well) is much more effort than fixing the startup time on Visual Studio or having the watch window update quickly (even though it used to do both in the VC6 days). Now, before anyone starts talking about it being impossible to turn a decayed codebase around as compared to a green-field new project like Remedy, remember that the NT kernel codebase is even older than Visual Studio, and Microsoft still finds a way to work with it. In fact, the kernel has much tighter backwards-compatibility restrictions than anything in Visual Studio ought to have, and I can't even imagine the level of politics and business decisions concerning the core OS.

So, I'd argue that Microsoft is capable of marshalling its resources to solve the problems in Visual Studio if they thought it was important (or as Casey put it -- if they cared to do so). They don't. Whether it is because it makes no sense financially or strategically, or because they think this is okay, is irrelevant from the consumer's point of view, for the same reason that one wouldn't care what organisational troubles led to the restaurant you were eating at serving you unappetising slop. The customer cares about the results.

Also, while I agree that culture and values are king practically everywhere you have a group of people, I don't think we should use "Large groups have a hard time aligning on values, so inner conflicts seep into their work and weaken it over time" as an excuse for poor execution. It would be very easy to brush off any incompetent response from a company/government/any large body as not their fault if we took this stance to its logical conclusion, which is "This is just the way things go when they get this big".

Consider if this would be a helpful attitude in more dramatic circumstances -- say a super-bridge collapsed or airplanes started falling out of the sky. I don't think people would give the builders a pass because the organisations have become so large that the political and managerial effort required to run them has outgrown the effort put into engineering. They would still be held accountable for the quality of their work.

Now, I realise that Visual Studio start-up time (and watch window stepping lag) is not a matter of life and death, so perhaps we should not hold them to the same standard, but before you resign yourself to thinking "This is what happens when organisations get big", think about all the times when a truly massive collection of people has organised to achieve impossible things:

  • The level of organisation during every war effort - I don't agree with the goal of war, but the amount of industrial organisation needed to sustain it is staggering. If you haven't read about the levels of industrial reorganisation during World War Two, it's a fascinating read.
  • The first Moon landing required much more engineering than whatever magic loads the modules for Visual Studio. Or even the Visual Studio installer, which hasn't been reliable for me since version 2010!
  • Look at the level of organisation on any disaster response of even moderate scale. People can really get organised even without any of the infrastructure we take for granted day to day.
  • Take a look at what it takes to fabricate a modern CPU. Frankly, it is crazy what you need to be able to do, crazier still if you take a look at what the supply chain looks like. This holds true for every auto and ship maker out there as well.


Large organisations can and have organised themselves to do much more difficult things than resolve code ownership issues and fix performance problems that have accrued for over a decade. It's just a matter of priority and values. This is why I think Casey can rightfully say that they don't care. The performance issues are much lower priority than, say, pushing Azure for business, so the effort ends up going there.

Arnon
VROOM
Consider if this would be a helpful attitude in more dramatic circumstances -- say a super-bridge collapsed or airplanes started falling out of the sky. I don't think people would give the builders a pass because the organisations have become so large that the political and managerial effort required to run them has outgrown the effort put into engineering. They would still be held accountable for the quality of their work.


I never said that Microsoft at large should not be held accountable(!).
And I was not "excusing" their failure to address the issue with VS.

All I was saying is that the argument that the cause of this failure should be placed exclusively at the feet of the individual software developers on the VS team is just complete and utter nonsense.
There are many other factors at play here that are very obviously far more to blame, none of which have anything whatsoever to do with the competence of any particular developers on that team - about which C.M. has no knowledge and is purely conjecturing - knowingly.

About your other remarks:
There's nothing more galvanising than a common threat - that's plain and obvious.
People's ability to coordinate en masse in crisis situations was never in question in anything I've said here.

But to even suggest that this invalidates anything I've said is pure reductio ad absurdum, and you know it.

Going to the moon is another case where it's pretty obvious why people can coordinate to some extreme degree.
It was a lofty, extreme goal of advancing humanity to the stars - who wouldn't galvanise around that?
And in that case as well, people's lives were at stake. So, again - not applicable.

Whenever there is clarity and unity of purpose, people can and do coordinate extremely well in large numbers.
But to expect that kind of galvanisation and unity of purpose for something as marginal as a piece of software loading in less than 10 seconds is just ludicrous....

Finally, I think I should make it clear:
I DO believe that Microsoft should be held accountable for this.
I DO NOT agree that assuming that developer incompetence is the cause is even remotely appropriate.

And I don't see it as a life-and-death kind of accountability, so I can't accept any of the comparisons made here.

Arnon
The loading time of any large piece of software, because it involves so many other teams, is the ultimate nightmarish political minefield - nobody is going to stick their neck out and become a political target of every other team - that would literally be political suicide - UNLESS they are already at the very top of the food chain politically, as is the case with Blender.

The only way this issue with VS gets resolved is if someone like THAT within Microsoft literally "dictates" to all the VS team heads that this is a priority - as an order from the top.

Unless and until that happens, blaming developers is just misplaced frustration, and expecting them to resolve this is an exercise in futility.

Ryan Fleury
Very interesting discussion.

I think what probably everyone can agree on is that there are a multitude of factors at play—I'm sure it's not actually true that 100% of developers on the Microsoft Visual Studio team are incompetent, but rather that if they are competent, they are probably not being put in a position to apply that competence in any real way.

My short experience at large companies mirrors this—it doesn't matter if you know a better architectural design for some particular system. There are other factors at work—all of your decisions are subject to micromanagement by superiors and project managers who are probably managing people more than software. The incentives of those doing the managing and the incentives of the engineers do not necessarily align.

Additionally, I think that Arnon makes an interesting point regarding larger teams and the kind of software that such teams tend to produce. I think that larger teams inherently lead to more complexity, and that either: A) larger teams are always less coherent and fluid than smaller ones, or B) our ideas about how to make teams scale are just completely wrong.

I don't know which is the correct answer, but what is definitely true is that there are unmistakably more engineering hours available on a larger team, and that larger number of engineering hours has been spent to make a product that, to a user's eyes, is in basically every way worse than the alternative (RemedyBG). The solution is unclear to me... Do you just opt for smaller teams? Or do we try to solve the scaling problems better?

For me, I'm partial to smaller teams that work well, but does anyone have any ideas about scaling teams in a way that maximizes (or at least doesn't minimize) individual engineer contribution and productivity?

VROOM
I'm not talking about one programmer fixing it all, though. I'm talking about Microsoft as an organisation deciding that it is important that their updater can never ever mess up your installation of Visual Studio. Or that it's important that the watch window get updated immediately. I'm talking about Microsoft "deciding" this in the same way that Sony decided that they will do what they can to eliminate load times in their next console -- as an organisation.

I would bet that if the performance issues Casey was talking about were a serious threat to Microsoft's existence -- I'm talking Microsoft going bankrupt in a year if they do not fix them -- Microsoft would fire half their staff and rewrite Visual Studio from scratch to be fast, if that's what it would take for the company to survive. But there are no such pressures on the company, and they either have no intrinsic motivation to organise themselves around setting the performance bar higher, or they can't for some reason, be it lack of skill, bad culture, or something else entirely.

So, when I say that Casey has a right to say "Microsoft doesn't care" I mean that fixing the issues he talks about doesn't seem to be a priority for Microsoft as a company. Microsoft has more than enough resources to make big moves if they needed to.

Miles
Arnon
Blender has had a single BDFL (like Python) all throughout its existence - something you NEVER get in ANY commercial software. Someone who can veto certain decisions that would negatively impact Blender's loading time, and who sticks around for decades. Name me a single commercial software product that has that....

Out of just the handful of commercial programs I have installed right now, there's at least Aseprite, FL Studio, and Sublime Text, not counting a dozen or so video games.

Arnon
Blender can very well allow itself to even stagnate for very long durations, something that NO commercial software can ever afford.

Except, as you well know because I specifically pointed it out, it doesn't.

Arnon
making decisions that are "not politically correct", or rather motivated by a philosophy that is counter to the PC culture.

I don't think you know what the phrase "politically correct" means.

AsafG
These are the people who can help you get the VS startup time down to "under 10 seconds"

Arnon
That would be very unlikely

Well from watching Casey's demonstration, it seems to have happened already.

Arnon
All I was saying is that the argument that the cause of this failure should be placed exclusively at the feet of the individual software developers on the VS team is just complete and utter nonsense.

Nobody has made that claim. What are you even responding to exactly?

I could nitpick further, but the basic problem I'm seeing is that, as far as I can tell, there's no coherent and specific case you're making here. You've said that software quality is purely a broad cultural and economic problem that can't be fixed, overcome, or laid at the feet of individuals or specific organizations, but also that you believe Microsoft should be held accountable, and also that the key to making good software is having a leader (i.e. an individual) that will select other competent individuals, but also that having such a leader won't necessarily help in making good software because "the culture around them might not accept their decisions" (what???), but also that it's actually okay for VS to be super slow anyway because you personally don't launch it very often, etc. You're all over the map in terms of the claims that you're making.

Arnon
"... so what you're saying is..."
Please...

You're just trolling at this point.

My main point is: it's complicated!

Trying to reduce that complexity to any single variable is just plain simple-minded.
It reminds me of the Jordan Peterson vs. Cathy Newman interview...

I know very well what PC culture is.
If you think your idea of it is different, please elaborate.

I'm holding MS as a corporation accountable - not its foot soldiers.

In any group of social animals, the higher an individual is in the dominance hierarchy,
the more power they have to effect change and counteract the natural tendency to devolve into chaos.
That's far from being a controversial statement - it's self-evident.
At the same time, it's qualified by cultural differences: even a top dog is unlikely to radically change a culture.

Unless a coherent unity of purpose is maintained all throughout a product's lifetime,
it is natural for chaos to erode it over time.

Right there you can already see a multi-variate/multi-dimensional dynamic.
I'm not going to detail a solution to MS's problems as it's not my job and they're not paying me for it.

I don't think I'm being incoherent - I'm laying out a number of factors at play that C.M. seems to have ignored in his rant - that's it. How do these factors interplay? What is the dynamic between them? That's not my role to say. I'm just enumerating the ones that came to mind as I was listening.

As for the examples you mentioned, I think I should have qualified my request to "large, long-lived commercial software", not just "commercial software". And I'm not familiar with any of the examples you mentioned, so I can't comment on them, but I guess they wouldn't qualify. To the degree that they would, I'd say they'd be the exception to the rule.

Arnon
If we were to expand this into a more general software-quality discussion,
and I were asked what "I" think "should" be done, I could start thinking in those terms.

I think that J.B.'s rant about preventing the impending apocalypse was understating the problem.
It's not just software - it's way bigger, wider, deeper, and more subtle, complicated and fundamental than that.

The software industry is still very young in relative terms - it's what, half a century old? Probably less?
So there are conclusions and lessons that we're only now barely starting to become conscious of.

But here are a few that I think have become largely evident by now, as prerequisites for software quality:
1. Developers have to be paid continuously and appropriately.
2. Software should not be tied to commercial incentives (i.e. shareholders of corporations).
3. Software evolution should be skeptical about trendiness and hype.
4. Releases don't need to be at consistent intervals (in fact they probably shouldn't be).
5. The culture needs to be a lot more complexity-averse than it is right now.

I can go on, but we can start with those.
Some are cultural.
Some are structural.
There will probably be conflicts between them.
There probably are/have been better and worse ways of achieving those - they should be analysed.

One way of getting #2 is having the software itself released for free (not necessarily open-source; these are orthogonal), while monetization is achieved by other means.
The Unreal Engine model is an interesting one to look at.

People working on FOSS software should be paid for it consistently in some fashion.
The Blender Foundation is an interesting model to look at.

Consistent release schedules are a bad idea in my estimation, because they virtually guarantee that every release will contain some/many components in very immature and/or incomplete states.

This could go on, but should probably be done in some other thread than this one...

VROOM
I'm not convinced that frequent releases necessarily need to lead to software bloat, though. First, you can space out releases however far apart you want. We're used to Visual Studio versions being named after years, like "Visual Studio 2019", but there is nothing forcing Microsoft to name them that way. They could have used any versioning scheme they felt like and released every year and a half, or even every two years.

But say they don't want to do that because it's more profitable to release year after year. They could still decide that every few releases they'll add some very small set of highly requested features and focus all the remaining time on fixes and performance improvements. There are two reasons why this can make sense:

1. Once you cover the bulk of the features that most developers will use, anything else you add will be used by smaller and smaller minorities over time. But performance improvements and fixes to core features will benefit everyone. Even the people at Microsoft, who I assume use Visual Studio to develop/debug to some extent.

2. I don't know many people who won't upgrade for improved performance. Now, obviously that's anecdotal, but looking at the number of people who update their mobile apps the moment a new version becomes available, when the only changes it lists are "Fixes and performance improvements", makes me hopeful that updates will be desired if people are certain that the new version is an improvement, the update won't break anything, and the update itself isn't an impediment to getting work done (so, no 50-minute updates).

But say Microsoft absolutely needs to add features in every new version: they can add features to any part of the product. Visual Studio ships a compiler, an IDE, and a debugger, among other smaller tools. You can make significant improvements to any of those, so it's not like they are in danger of reaching some perfect product state that can't be improved. Here are some examples of features which I believe every developer would love to have:

  • Debugging multi-threaded code sucks. It's obviously doable, but not as straightforward as debugging single-threaded code. Microsoft could have opted to add something to help people do it. Maybe allow them to visualise which threads touch which parts of memory, and when, in some understandable fashion, or do something smarter than what I can think of in 10 minutes. My point is that nobody offers facilities for debugging multi-threaded code as simply as single-threaded code, in an integrated environment with a straightforward UI. I'm not talking about making multi-threading synchronisation bugs magically easy to fix; I mean that there is plenty more busywork involved in tracking down multi-threading bugs than in debugging single-threaded code. If Visual Studio were to cut down on that busywork, that would be a nontrivial quality-of-life improvement for many developers, and would cement Visual Studio's place as "best debugger out there, period".
  • Debugging optimised code sucks. Yet, Microsoft control the compiler, debugging symbols format, and the debugger. Perhaps they could make an optimised build (or “reasonably optimised build”) that can be stepped through in a debugger. If that’s too much to ask, maybe it can keep enough information about the decisions the optimiser made to tell you “Hey, I can’t show you where this variable is, but here’s what I did to it” and then you could figure out that you have it in a register or something. I’m not sure if this is feasible or not, since I have no experience with production compilers, but I have written expert systems that have to be able to explain their decisions succinctly (not dumping the whole pipeline of decisions made), so maybe something can be done.
  • The compiler could get faster and they could not touch the IDE for a release and it would still be fine, I think.
  • All the performance bookkeeping Casey does in Handmade Hero could be something Microsoft does for you. They know where all the code is - they compile it - and they could collect all sorts of metrics about it if you enable a compiler flag. (A minimal sketch of the kind of bookkeeping I mean follows this list.) Microsoft could even partner with Intel and NVidia and provide VTune and Nsight inside Visual Studio natively, not as plugins that sometimes behave oddly.
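
For context, the bookkeeping I mean is essentially a hand-rolled cycle counter around interesting blocks of code. Here's a minimal sketch of the idea (my own paraphrase, not Casey's actual code, and the block names are made up):

```c
// Minimal sketch of hand-rolled per-block performance counters (hit count + cycles).
// Not Handmade Hero's actual code; block names are invented for illustration.
#include <stdio.h>
#include <stdint.h>

#if defined(_MSC_VER)
#include <intrin.h>
#else
#include <x86intrin.h>
#endif
#define READ_CYCLE_COUNTER() __rdtsc()

typedef struct {
    const char *Name;
    uint64_t HitCount;
    uint64_t TotalCycles;
} debug_counter;

enum { COUNTER_UpdateAndRender, COUNTER_RenderGroup, COUNTER_Count };
static debug_counter GlobalCounters[COUNTER_Count] = {
    [COUNTER_UpdateAndRender] = {"UpdateAndRender"},
    [COUNTER_RenderGroup]     = {"RenderGroup"},
};

// Record the cycle count at the start of a block, and accumulate it at the end.
#define BEGIN_TIMED_BLOCK(ID) uint64_t StartCycles##ID = READ_CYCLE_COUNTER();
#define END_TIMED_BLOCK(ID) \
    GlobalCounters[COUNTER_##ID].HitCount    += 1; \
    GlobalCounters[COUNTER_##ID].TotalCycles += READ_CYCLE_COUNTER() - StartCycles##ID;

static void UpdateAndRender(void)
{
    BEGIN_TIMED_BLOCK(UpdateAndRender);
    // ... actual per-frame work would go here ...
    END_TIMED_BLOCK(UpdateAndRender);
}

int main(void)
{
    for (int Frame = 0; Frame < 60; ++Frame) UpdateAndRender();

    // Dump the counter table, like the debug overlay does.
    for (int i = 0; i < COUNTER_Count; ++i) {
        debug_counter *C = &GlobalCounters[i];
        if (C->HitCount) {
            printf("%-16s %8llu hits, %12llu cycles, %10llu cycles/hit\n",
                   C->Name,
                   (unsigned long long)C->HitCount,
                   (unsigned long long)C->TotalCycles,
                   (unsigned long long)(C->TotalCycles / C->HitCount));
        }
    }
    return 0;
}
```

The compiler already sees every one of those blocks, so it could emit this kind of instrumentation behind a flag instead of every project hand-rolling it.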


I’m not saying this would be easy to do, but it’s not some impossible thing that we don’t know how to do either. Any release could have had some of this stuff, or much better things that I’m not thinking of. Like I said, Visual Studio ships a compiler, debugger, and IDE — that’s a big surface area you can make improvements to. So, I don’t know why a hypothetical manager/engineer would opt for over-engineering or planned obsolescence unless the bulk of the organisation were either corrupt or incompetent to such a degree that they can’t do it.

Arnon
About regularly scheduled releases:
My point is that different aspects or features of any software have their own maturation curves and rates.
So any given point in time is a snapshot of partially-overlapping curves at different levels.
The likelihood that an "optimal" release time for any combination like that would ever coincide with any regular schedule is probably approaching zero.
So to the degree that there could ever be such an "optimal" time, it would probably never manifest at any regular rate.
So, to optimize the quality of what gets released, it's better to align the release times with whatever state the curves are in - not the other way around, and not with whatever the marketing department thinks.

Arnon
What I meant by the analogy to planned obsolescence is that for any given feature/capability, there's some zenith point beyond which there's nothing that can be added to it that wouldn't damage it in some way. This point is rarely identified properly, and even when it is, there are often factors that push redundant complexity into it anyway, to satisfy some whim coming from external factors, like marketing, shareholders, hype, fashion, etc.
These are typically pushed onto managers from above or from outside.
They may even realise it's a bad move and try to fight it, but will not always succeed - again, due to internal politics.

Free software is susceptible to hype and fashion, but not to shareholders (as there typically aren't any), and much less to marketing (if there is any).

Arnon
And yes, MS is very much able to improve the performance of VS - they have done it in VS 2019 in the compile-time domain (from what I saw).
But performance optimisation is never free - it almost invariably increases the cost of maintenance afterwards, as optimized code is almost always more complicated, and it becomes more resistant to change, as optimized code tends to be more coupled.

In my prior job it was very evident that flexibility and performance/efficiency push in opposing directions.
It was also clear that the more agreement can be formed across a wider span of people/teams, the more potential there is for increasing performance. Which is also to say the reverse: the less agreement can be formed, the less possible it is to increase performance, because the code needs to account for more possible inputs.
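
A toy example of the shape of the thing (made up for illustration, not from any real codebase): if you can't get agreement on the data layout, the code has to handle arbitrary strides; if everyone agrees to pass tightly packed rows, the whole thing collapses into one trivially fast copy.

```c
// Toy illustration: how much agreement exists about the inputs determines
// how much the code can be specialised (and therefore optimised).
#include <string.h>
#include <stddef.h>

// No agreement: every caller may hand us a different source/destination stride,
// so the copy must stay generic and go row by row.
void CopyPixelsGeneric(unsigned char *dst, size_t dstStride,
                       const unsigned char *src, size_t srcStride,
                       size_t rowBytes, size_t rows)
{
    for (size_t y = 0; y < rows; ++y) {
        memcpy(dst + y * dstStride, src + y * srcStride, rowBytes);
    }
}

// Agreement reached: everyone passes tightly packed rows,
// so the whole thing is one large copy.
void CopyPixelsPacked(unsigned char *dst, const unsigned char *src,
                      size_t rowBytes, size_t rows)
{
    memcpy(dst, src, rowBytes * rows);
}
```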

And that's a purely political dynamic.
It has nothing whatsoever to do with any technical aspect or competence of anyone.

The more an organisation's value system favours flexibility over performance, the more performance will take a back seat.
And the less agreement can be reached, the more valuable flexibility becomes, because you can't know exactly what's going to come your way, so you gravitate towards valuing flexibility overall.