The development of the IBM OS/360 operating system over 30 years ago was a monumental undertaking. That effort spawned many technical innovations, now widely copied. In spite of its technical successes, OS/360 was a management failure: "The product was late, it took more memory than planned, the costs were several times the estimate, and it did not perform well until several releases after the first." Dr. Frederick P. Brooks, who managed the project, published many of his lessons learned in the book "The Mythical Man-Month: Essays on Software Engineering". The book remains one of the best-regarded texts on software project management.
Today, situations similar to that of OS/360 remain all too familiar.
There are several techniques for cost estimation, but the two basic approaches are top-down and bottom-up. In the top-down approach, cost is derived from a business analysis of the major project components. In the bottom-up approach, cost is derived by accumulating estimates from the people responsible for the various components.
The primary techniques for cost estimation are:
- Ask an expert in the domain of interest
- Analogy to other projects
- "Price-to-Win" strategy
- Resource availability
- Parametric models
There are over a dozen parametric models available in the literature. A good survey of parametric techniques is: F.L. Heemstra, "Software Cost Estimation", Information and Software Technology, Vol. 34, No. 10, Oct. 1992.
"Cost does not scale linearly with size" is perhaps the most important principle in estimation. Barry Boehm used a wide range of project data and came up with the following relationship of effort versus size:
effort = C x size^M
This is known as the Constructive Cost Model (COCOMO). C and M are always greater than 1, but their exact values vary depending upon the organization and type of project. Typical values for real-time projects utilizing the very best practices are C=3.6 and M=1.2. Poor software practices can push the value of M above 1.5. A bit of plotting for various values of M should quickly reveal the value of utilizing best practices! Other factors that can influence M are: application experience, leadership capability, new environment and tools, requirements uncertainty, and software reuse.
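The superlinear growth is easy to see numerically. The sketch below plugs the values quoted above (C=3.6 with M=1.2 for best practices, and M=1.5 for poor practices) into the COCOMO relationship; size is in thousands of lines of code (KLOC) and effort in person-months, as in Boehm's formulation.

```python
# COCOMO-style effort curve: effort = C * size^M.
# C=3.6, M=1.2 are the "real-time, best practices" values quoted above;
# M=1.5 represents poor practices.

def effort(size_kloc, c=3.6, m=1.2):
    """Estimated effort in person-months for a project of the given size."""
    return c * size_kloc ** m

for size in (10, 50, 100, 500):
    good = effort(size, m=1.2)
    poor = effort(size, m=1.5)
    print(f"{size:4d} KLOC: best practices {good:8.0f} PM, "
          f"poor practices {poor:8.0f} PM ({poor / good:.1f}x)")
```

Note that the penalty for poor practices grows with project size: at 10 KLOC the two curves differ by about 2x, but at 500 KLOC the gap exceeds 6x.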
One consequence of the COCOMO model is that it is more cost effective to partition a project into several independent sub-projects, each with its own autonomous team. This "cheats" the exponential term in the COCOMO model. Partition strategies include domain analysis and partitioning by CPUs.
Cost estimation is an iterative process. Precisely predicting software cost on large projects is an intractable problem - even within the most stable environments and mature organizations. Thus, software estimation needs to be performed several times as a project matures. Each rework of the estimate should be more accurate because more information is known.
The human factor is the dominant parameter in cost estimation. Cost can be driven down substantially by utilizing quality analysts and programmers. Based upon his parametric studies of 63 projects at TRW, Boehm was able to quantify the advantage of utilizing quality analysts and programmers: having a very low skill level among analysts or programmers will cost twice as much as having a very high skill level. If both analysts and programmers have a very low skill level, costs can quadruple. (Hint: training is good.)
Other factors that have a large cost impact are required reliability and complexity. Stringent reliability requirements can double costs, as can high complexity. If both high complexity and high reliability are required, costs can quadruple. (Moral: taking time to simplify pays handsome dividends.)
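In Boehm's model these factors enter as multiplicative adjustments to the nominal effort, which is why two bad ratings compound to roughly 4x. The sketch below is illustrative, not definitive: the multiplier values approximate Boehm's published COCOMO cost-driver tables (analyst capability, programmer capability, required reliability, complexity), and the driver names are spelled out rather than using the ACAP/PCAP/RELY/CPLX abbreviations.

```python
# Hedged sketch of COCOMO-style cost drivers as multiplicative effort
# adjustment factors. Multiplier values approximate Boehm's tables and
# are for illustration only.

DRIVERS = {
    "analyst_capability":    {"very_low": 1.46, "nominal": 1.00, "very_high": 0.71},
    "programmer_capability": {"very_low": 1.42, "nominal": 1.00, "very_high": 0.70},
    "required_reliability":  {"very_low": 0.75, "nominal": 1.00, "very_high": 1.40},
    "complexity":            {"very_low": 0.70, "nominal": 1.00, "extra_high": 1.65},
}

def adjustment(ratings):
    """Product of the selected multipliers; scales the nominal effort."""
    factor = 1.0
    for driver, rating in ratings.items():
        factor *= DRIVERS[driver][rating]
    return factor

low_skill = adjustment({"analyst_capability": "very_low",
                        "programmer_capability": "very_low"})
high_skill = adjustment({"analyst_capability": "very_high",
                         "programmer_capability": "very_high"})
print(f"low-skill vs high-skill team: {low_skill / high_skill:.1f}x cost")
```

With these numbers the low-skill team costs a little over 4x the high-skill team, matching the "quadruple" figure above; reliability and complexity drivers compound the same way.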