I think you're right to some degree - there is definitely more opportunity to work as a cohesive unit, and with the advent of wikis and source control a lot of that manual sharing of information can now be handled, especially if design decisions etc. are documented well.
I think there is still some truth to the idea though that you cannot just throw people at a project and expect it to get done faster.
You propose an interesting counterpoint to the arguments raised in The Mythical Man-Month, but I believe the projects discussed in the book are different enough from Linux that the two cannot be compared well.
For instance, one issue that arises is the degree of specificity required by a typical large software project versus that of the Linux project. My relatively ignorant view of Linux is that a problem presents itself, and an individual (or team) solves the problem to the best of its ability. The same approach is used by small firms producing products: they do what they think is best.
Larger firms, on the other hand, are much more specific in their requirements. They don't turn to a developer and say "Hack until this works." The approach draws more on "You need to do this in this specific manner, no questions." The problem of communication, and lost efficiency, arises when those who do the legwork are separate from those who do the overall design.
Perhaps if a large company were better able to trust its developers to do the project well, we wouldn't see the sharp decline in productivity. Of course, not all developers are created equal, but if you eliminate the need to communicate by sticking to broad outlines and general compatibility, then the issues do not arise.
Thanks for reading!
Not sure I agree with your assessment of large firms and their requirements. Every job I've worked at has had a mixture of solid, well-planned requirements and bugs that pop up in enterprise applications and need to be fixed ASAP - just make it work, pronto! The same thing happens close to crunch time as requirements change.
I think you and I agree on this point though, that it is easier to do work separately with minimal communication (as much open source software has) if the software is already fairly stable and people are just working off on branches/components that are sectioned off from each other.
I am quite critical of large firms, so my bias does appear in the writing (I didn't realize it was that obvious when I first wrote it...).
You are right in terms of our shared philosophy. Focused, distinct units work very well. If the need for communication appears, then direct questions to the head of an individual branch should be able to sort out the difficulties.
By the way, thanks to Andy for letting us use his comments as a basis for our discussion.
Your opinions are nice and everything, but whereas The Mythical Man-Month was a scientific study, you fail to provide any statistics that back up your loosely connected theories.
Let's take, for example, the following claim you made:
"Software development productivity has improved enormously in the past 30 years. Today's developer has fast hardware, dynamic scripting languages, instant access to tools, documentation, and advice over the Internet, and decades of experience in development process to draw on. As a result a contemporary developer able to produce a lot more "function points" per day than a developer could produce 30 years ago, or even 10 years ago at the height of the Internet bubble."
Let's dissect this paragraph:
"Software development productivity has improved enormously in the past 30 years."
Define "enormously." And how do you know this to be true?
"Today's developer has fast hardware, dynamic scripting languages, instant access to tools, documentation, and advice over the Internet, and decades of experience in development process to draw on."
Again, where this list comes from seems magical, the implication being that all of these things have had a direct effect on a still-yet-to-be-proven dramatic improvement in software success. For example, dynamic scripting languages are more prevalent on the web than in the primary building blocks of corporate applications (this is slowly changing, which is good news). One could even argue that dynamic languages are specifically suited to rapid development/prototyping on the web. Perl isn't so different from PHP, and is in fact probably more relevant to corporate applications (and it's still being developed, with Perl 5 being a big rewrite with YAML, etc.), yet Perl has been around for a while and hasn't necessarily fixed productivity problems.
"As a result a contemporary developer able to produce a lot more "function points" per day than a developer could produce 30 years ago, or even 10 years ago at the height of the Internet bubble.""
Languages have evolved, yes, and thank goodness! But has success evolved? You don't make a very compelling argument.
You do make some good points; however, you are talking about team sizes and equating team size with adding people to a late project.
Brooks' Law says that adding people to a LATE project makes it LATER.
It doesn't say big teams are bad.
Your point of view makes a lot of sense.
I've always been surprised by the way a company works (in terms of software projects). I've seen thousands of developers in open source organizing themselves and achieving great results, whereas I've seen less complex projects fail in the enterprise world.
I came to the conclusion that a company could work with the methodologies used in the open source world, as you mentioned, if they provide such great results.
But I've finally come to a different conclusion. The enterprise world and the open source world are fundamentally different, and the major difference is the willingness of developers to participate in the common effort.
In most companies, developers consider their job just a job; they don't share any particular enthusiasm for the final outcome, whereas an open source developer chooses to participate in a specific project.
This is why the open source methodology won't work for most companies, and they'll have to use the classic methodology of software development (i.e. requirements, specs) in order to control the quality of the outcome, no matter the cost in productivity.
I think that you are missing what Brooks's law really says. It basically says that serial tasks can't be performed in parallel. That's why nine women can't have one baby in one month. But three women can have three babies in nine months, whereas one woman would take 27 months to perform the same task.
The more you can isolate and parallelize, the more people you can add to the project.
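The arithmetic behind the baby example generalizes into an Amdahl's-law-style back-of-the-envelope sketch; the function and numbers below are illustrative, not from the original discussion:

```python
def completion_time(total_work, serial_fraction, workers):
    """Months to finish a project with a serial portion that cannot be parallelized.

    total_work      -- effort in person-months
    serial_fraction -- share of the work that must be done sequentially (0.0 to 1.0)
    workers         -- number of people on the project
    """
    serial = total_work * serial_fraction
    parallel = total_work * (1 - serial_fraction)
    return serial + parallel / workers

# A fully serial 9-month task ("one baby") gains nothing from extra people:
print(completion_time(9, 1.0, 3))   # 9.0

# Three independent 9-month tasks ("three babies") parallelize perfectly:
print(completion_time(27, 0.0, 3))  # 9.0
print(completion_time(27, 0.0, 1))  # 27.0
```

The `serial_fraction` of a real project is rarely 0 or 1, which is exactly why adding people helps only up to a point.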
Be careful when you try to slay the werewolf with your shiny new silver bullet! Countless others have made that claim and failed.
I would say: feel free to ignore "The Mythical Man-Month" at your own peril! At least you won't be alone in the tar pit...
You propose that projects are limited by dependencies more than by communication. I seriously doubt this is the case.
Project planning is a pretty well-known field, and ensuring that tasks are done in the correct order with respect to dependencies is not something you have to invent. E.g., take a look at "critical chain project management", which at least the last three software companies I've been at have used.
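Dependency ordering really is a solved scheduling problem; here is a minimal sketch (the task names are made up for illustration) using the topological sort in Python's standard library:

```python
from graphlib import TopologicalSorter

# Hypothetical task graph: each task maps to the tasks it depends on.
tasks = {
    "design": [],
    "backend": ["design"],
    "frontend": ["design"],
    "integration": ["backend", "frontend"],
    "release": ["integration"],
}

# static_order() yields the tasks in an order that respects every dependency.
order = list(TopologicalSorter(tasks).static_order())
print(order)  # 'design' comes first, 'release' last
```

A cycle in the graph (a real planning error) raises `graphlib.CycleError`, so the ordering problem and its failure mode are both handled for you.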
What I see limiting projects is not so much dependencies as the complexity of the products and finding out whom to talk to. In an organization of 10 people this is not a problem. Currently I have 500+ co-developers, and a software product that matches that scale. Needless to say, this puts very different requirements on communication than a smaller project does.
You're kidding right?
Brooks' law: adding manpower to a *late* software project makes it later.
Linux has the enviable luxury of never being late. There are no schedules, no market deadlines to meet. It's done when it's... actually, it's never done.
On projects for which there is a paying client or a competitor waiting to eat your lunch, ignore the core of Brooks' advice at your peril.
Adding manpower early to an understaffed project will improve its chances of on-time completion, but as surely as someone on the internet is being called Hitler, there will come a point where you get negative returns for every engineer you add to that project.
Your point that open source software projects don't suffer from Brooks's Law is a good one. But, as others may have mentioned, I want to add one more point to contradict your stance.
Open source contributors are already familiar with a project by the time they've been promoted. Not only have they usually become power users of the project before contributing any code, but the selection process is also quite demanding: those who submit enough good patches (via e-mail, forum, IM, whatever it is) are more likely to be promoted than newcomers, and anyone able to submit numerous patches must already have a pretty good understanding of the project. The time those new contributors spend learning the project is therefore reduced.
Commercial software projects, on the other hand, don't have this kind of advantage. Their selection process may involve background interviews, but that doesn't give candidates any inside knowledge of the project they're about to join. They know little by the time they become full-time developers on the project, and have to familiarize themselves with it, largely through their colleagues, so the whole team has to spend a fair amount of time teaching new developers.
These are just some additional characteristics that distinguish open source projects from commercial ones, and that make Brooks's Law less relevant to open source projects.
PS. Sorry for my weak English.
I agree with you almost 100%, but I think there is a concept underlying all these analyses, namely: "at a given time, any developer in a project can build n function points."
The skills and the personality of a developer are very important. A good developer can do the job of 5 developers.
I agree that tools have made development easier; however, no one is mentioning the increase in software complexity and customers' expectations over the years. Let's take a website development project as an example.
Fortunately, tools did improve the process, coupled with better methodologies like DSDM and Scrum, to mention a few. However, we can't really say life is easier now, since the workload has increased.
I think one should use laws like Brooks's law as guidelines and not treat them as absolutes that cannot change. One should try different techniques and measure outcomes. Software engineering is not an exact science yet, mostly because it deals with people, not machines.
A very thought provoking essay.
An analogue of the problem can be posed this way: can three authors write a trilogy in one third of the time? Certainly, with careful story-planning and by minimizing the number of characters that pass from one volume to another, a trilogy can appear consistent, even though each author has only a passing knowledge of what the other authors have written.
Similarly, improvement in software techniques, better understanding of componentization, and a better software stack (three tier is now de-facto) leads to more understandable processes and better separation of concerns.
Software mashups are a refutation of the mythical man-month, in the sense that interesting programs can be stitched together with minimal communication overhead between development teams.
I solved the MMM NSB but I can't get funding. It took me 5 years to solve it. I had to rethink everything about programming to figure it out.
I figured out how to develop (build) software 1000x faster, reuse all code, reduce bugs by more than 90%, and build code based on human thinking instead of machine thinking (meaning). If you wanna fund me, contact me. Yes, I'm in Silicon Valley.
I think there is another key aspect to Linux (and most open-source software) that you are forgetting: in Linux, the user and the client are the same person.
When writing an enterprise application, they frequently are not.
This (obviously) has implications for productivity and communication. While it is not a refutation, it still damages your argument that an enterprise application can be run like an open source one.
The communication problem comes from the dependencies. If I have no dependencies, I don't need to talk to anybody. If I have to interface with 100 poorly documented libraries, I have to talk to the 100 authors. Or I have to reverse engineer the libraries, which is just a different way to be unproductive.
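Brooks quantifies this intercommunication burden in the book: among n people who must all coordinate, the number of pairwise channels is n(n-1)/2. A quick sketch (the head counts are illustrative) makes the growth concrete:

```python
def channels(n):
    """Pairwise communication paths among n developers: Brooks's n(n-1)/2."""
    return n * (n - 1) // 2

for n in (2, 10, 100):
    print(n, channels(n))
# 2 people share 1 channel, 10 share 45, 100 share 4950
```

The quadratic growth is why isolating components (cutting the set of people who must talk to each other) matters far more than raw headcount.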
By the way, the Linux kernel is a poor model for this discussion. It shows that dependencies are relevant, but we have no idea how much developer time it takes to create Linux. Open source developer time is "free, as in beer", so nobody needs to track it or find ways to improve productivity.
Happy New Year -- My youngest son, a computer programming major, brought home a copy of the 1995 anniversary edition of MMM, and I found this post while looking for info on the revision (now 15 years old). I enjoyed Andy's article and the discussion, especially the insights related to the difference between open source projects and "bespoke" development products. However, I am disheartened that the software community is still debating points in a book that was already 10 years old when I was using it before said son was born!