
20.10.2022

6 min read

Tips for Accurate Software Cost Estimations

Developing high-quality software is hard. Developing it under a commitment to a given timeframe aligned with a software cost estimation is even harder.

One of the greatest “holy wars” in software engineering certainly resides in producing an accurate software cost estimate that satisfies stakeholders and engineering teams. Wrong estimates in custom software development often lead to changes in release schedules, which in turn creates a headache for everyone involved in developing and delivering the working software.

This makes accurate software cost estimation one of the most valuable skills for people in technical leadership roles. It is equally important for stakeholders to understand how good software estimates are produced.

Let's take a serious look at what makes a good software estimate and how to produce one.

What Makes Cost Estimations So Difficult?

There are two main reasons why software cost estimations go wrong:

  • Ambiguous and incomplete requirements
  • Over-optimistic nature of people’s assumptions

Incomplete requirements are pretty straightforward: something is missing. Ambiguous requirements are subjective and open to misinterpretation: supporting "large file uploads" could mean 50 MB, 500 MB, or even 5 GB files. The less you know, the more uncertainty you face and the greater the risk of changes down the road. Here is a simple chart that shows how the level of detail in the software specification affects the possible range of development costs.


Better specifications always lead to better estimations

The second thing to take into consideration is the risk of underestimation due to an engineer's bias. No matter how much information is provided about the software's future implementation, engineers tend to estimate the work presuming that everything will go as expected along the way. This is a dangerous and fallacious way of thinking because it ignores uncertainties that remain present almost until the very last phase of development:

  • Issues always come up along the road
  • Testing almost always takes longer than expected
  • Requirements also change quite frequently based on the feedback
  • Systems can become obsolete even during development

There are several methods to tackle the underestimation problem, such as model-based estimation, the Monte Carlo method, and PERT (Program Evaluation and Review Technique). Depending on the complexity of the project and your budget/time/resource constraints, you can choose how much math and which parametric models to include in your estimation process.

For this article, let's stick to a high-level estimation effort and the simplest technique: three-point estimation, of which PERT is the best-known variant. In its simplest form, the estimate is the arithmetic mean of three scenarios:

Estimated Effort = (Optimistic + Likely + Pessimistic) / 3

The classic PERT formula gives the likely case four times the weight: Estimated Effort = (Optimistic + 4 × Likely + Pessimistic) / 6.

The following approach will help you to mitigate the risk of underestimation, at least to a certain extent.
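
As a minimal sketch of how these formulas play out in practice (the task names and person-day figures below are purely hypothetical), both variants can be computed per task and summed:

```python
# Three-point effort estimation: simple mean vs. classic PERT weighting.
# Task names and numbers are hypothetical examples, not real project data.

def simple_mean(optimistic: float, likely: float, pessimistic: float) -> float:
    """Plain arithmetic mean of the three scenarios."""
    return (optimistic + likely + pessimistic) / 3

def pert_mean(optimistic: float, likely: float, pessimistic: float) -> float:
    """Classic PERT estimate: the likely case is weighted four times."""
    return (optimistic + 4 * likely + pessimistic) / 6

tasks = {
    "Auth service": (3, 5, 10),         # person-days: optimistic, likely, pessimistic
    "File upload pipeline": (5, 8, 20),
    "Admin dashboard": (4, 6, 12),
}

total_simple = sum(simple_mean(*t) for t in tasks.values())
total_pert = sum(pert_mean(*t) for t in tasks.values())
print(f"Simple mean total:   {total_simple:.1f} person-days")
print(f"PERT weighted total: {total_pert:.1f} person-days")
```

The weighted variant pulls the result toward the likely case, while the simple mean gives the extreme scenarios more influence; either way, the pessimistic number keeps the optimism bias in check.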

The Fundamentals of Cost Estimations

The basis of every estimation is decomposition. First comes the decomposition of the estimation process itself. Describing the process as a set of steps gives a better understanding of how to handle the task.

NOTE: The steps are written in sequential order, but the nature of the process is highly iterative, and the parties involved may revisit one step or another based on information that comes in along the way.

  1. Gather and analyze software requirements
  2. Define the elements that have to be implemented and the overall readiness of the existing software
  3. Estimate software size
  4. Estimate software effort
  5. Calculate software costs
  6. Determine risks

Each phase of the process can be decomposed further, and this is especially true for estimating software size and effort. Risks and requirements are beyond the scope of this article, so let's discuss effort, size, and cost estimation in a bit more detail.
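
To make the idea of decomposition concrete, here is a minimal sketch (with purely hypothetical item names and figures) of a work-breakdown structure in which each leaf item carries its own estimate and parent items roll the numbers up:

```python
from dataclasses import dataclass, field

@dataclass
class WorkItem:
    """A node in the work-breakdown structure; leaves carry estimates in person-days."""
    name: str
    estimate: float = 0.0
    children: list["WorkItem"] = field(default_factory=list)

    def total(self) -> float:
        # A leaf returns its own estimate; a parent sums its children.
        return self.estimate if not self.children else sum(c.total() for c in self.children)

# Hypothetical decomposition of a file-sharing product.
project = WorkItem("File sharing product", children=[
    WorkItem("Backend", children=[
        WorkItem("Upload API", 8),
        WorkItem("Virus scanning", 5),
    ]),
    WorkItem("Frontend", children=[
        WorkItem("Upload UI", 6),
    ]),
])
print(f"Rolled-up effort: {project.total()} person-days")
```

The deeper the tree goes, the smaller and more familiar each leaf becomes, and the less guessing each individual estimate requires.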

Decomposing for Effort and Size

The software effort has to be decomposed in such a way that the person responsible for the estimation understands the following:

  • The number of sequential constraints, which determines the elapsed time of the effort
  • The number of independent tasks, which determines how much manpower can be applied in parallel

Developer-hours and calendar months are not interchangeable parameters (well, they are, but only to a certain extent). This is why you need to understand how to calculate each independently and then combine them into an overall estimate.

By sequential constraints, we mean the dependencies between system components that block parallel implementation. That is, no matter how many people you assign to the product, a component cannot be implemented before the one it depends on is finished.

The number of independent tasks tells us approximately how much manpower can be used at full capacity while components are implemented in parallel.
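
As a hedged illustration of this distinction, the sketch below uses hypothetical components, efforts, and dependencies to compute the total effort, the critical-path length that no amount of manpower can shorten, and a rough upper bound on useful parallel headcount:

```python
from functools import cache

# Hypothetical component efforts (person-days) and dependency edges.
efforts = {"db-schema": 4, "auth": 6, "upload-api": 8, "upload-ui": 5, "admin": 7}
depends_on = {
    "auth": ["db-schema"],
    "upload-api": ["db-schema", "auth"],
    "upload-ui": ["upload-api"],
    "admin": ["auth"],
}

@cache
def finish_time(component: str) -> float:
    """Earliest possible finish: own effort plus the longest chain of dependencies."""
    deps = depends_on.get(component, [])
    return efforts[component] + max((finish_time(d) for d in deps), default=0)

total_effort = sum(efforts.values())                  # person-days of work to be done
critical_path = max(finish_time(c) for c in efforts)  # minimum elapsed days, even with unlimited staff
print(f"Total effort:  {total_effort} person-days")
print(f"Critical path: {critical_path} days minimum elapsed time")
print(f"Useful parallelism (rough upper bound): {total_effort / critical_path:.1f} people")
```

Even with 30 person-days of work in this made-up example, the dependency chain keeps the elapsed time at 23 days no matter how many people you add, which is exactly why the two numbers have to be calculated separately.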

Decomposing for Costs

Decomposition is also helpful when you proceed to calculate costs. One of the most reliable methods to do some math here is back-of-the-envelope estimation. Much has been written about this method, so I won't go into details here. Suffice it to say, it allows us to calculate disk space, expected CPU load, required network bandwidth, and so on. Obviously, these figures factor into the overall price of the hardware you'll need to run the product.

Every engineer should know, or at least be able to refer to, the well-known table of latency numbers, understand the powers-of-two concept used in such calculations, and know the availability numbers (SLAs) of the systems he or she will work with.
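
Here is a hedged back-of-the-envelope sketch for a hypothetical file-upload service; every number in it is an assumption chosen for illustration, not a requirement from any real project:

```python
# Back-of-the-envelope sizing for a hypothetical file-upload service.
# Every figure below is an assumption made up purely for illustration.

daily_uploads = 50_000        # assumed uploads per day
avg_file_mb = 20              # assumed average file size in MB
replication = 3               # assumed storage replication factor
peak_to_avg = 2               # assumed peak-to-average traffic ratio

daily_storage_gb = daily_uploads * avg_file_mb / 1_000
yearly_storage_tb = daily_storage_gb * 365 * replication / 1_000

avg_ingress_mbps = daily_uploads * avg_file_mb * 8 / (24 * 3600)  # megabits per second
peak_ingress_mbps = avg_ingress_mbps * peak_to_avg

print(f"Storage per year (replicated): ~{yearly_storage_tb:.0f} TB")
print(f"Average ingress bandwidth:     ~{avg_ingress_mbps:.0f} Mbit/s")
print(f"Peak ingress bandwidth:        ~{peak_ingress_mbps:.0f} Mbit/s")
```

Rough numbers like these are enough to pick storage tiers and bandwidth plans, and therefore to put a price tag on the infrastructure part of the estimate.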

Estimation Methodology (Spoiler: they’re all pretty much the same)

Now, once you have determined the requirements and decomposed the system into modules/sub-systems/services, the estimation itself comes into play. You may have read a couple of books on estimation already, but all the suggested methods have one thing in common: the experience of the person conducting the estimation.

99% of estimation methodologies are experience-based: expert judgment, estimation by analogy, model-based methods, rules of thumb, and so on. Everything comes from experience gained by building similar systems.

What you have to do is take a decomposed module and map it to something similar that has already been built. That could have been built by you (experience-based) or by someone else (expertise-based, estimation by analogy). There are three things to keep in mind to make the estimate accurate:

  • Understand the estimation context: software architects and software engineers operate in different contexts; hence, they estimate size and effort differently for the same product.
  • Decompose the system to a level that you can estimate from experience rather than by guessing.
  • Understand the nature of over-optimistic assumptions and calculate the mean value of the estimate by providing optimistic, likely, and pessimistic numbers (the three-point / PERT approach).

Yes, it’s highly recommended to hand the estimation over to the most experienced and qualified person in the team.

Some level of uncertainty is always present throughout the entire development process. It is a common practice for engineers to add 20–30% to the overall estimated time as a safety net to cover all contingencies.
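
Applying such a buffer to a hypothetical overall estimate is a one-liner:

```python
# Contingency buffer on top of a hypothetical overall estimate.
pert_total = 22.0      # hypothetical overall estimate in person-days
contingency = 0.25     # a 20-30% safety net; 25% chosen here as an example

buffered_estimate = pert_total * (1 + contingency)
print(f"Estimate with contingency: {buffered_estimate:.1f} person-days")
```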

Summary

So, what do we have in the end? Let’s point out the key action items you need to work on to deliver accurate software cost estimations:

  • Requirements. The clearer they are, the better the estimate will be. Keep pushing stakeholders to provide additional information and continue to iterate until you are confident and satisfied that you have all available data.
  • Decomposition. Decompose, decompose, and again decompose. Granularity is the key to accurate estimations.
  • Calculate mean values. Don't fall into the trap of making only optimistic assumptions. Bring in your optimistic numbers, but also think through the worst-case scenarios, add the likely case based on experience, take the (simple or PERT-weighted) mean, and proceed with it.
  • Protect the estimate from uncertainty. Add 20–30% to the estimated effort to cover contingencies. It is always better to overestimate than underestimate.
  • Get the most experienced opinion available. Experience is all we have to produce any meaningful estimate of effort, time, and cost. Make sure you involve the most experienced engineer in the room in the estimation process.