How long will my app take?

Drew Crawford · News

This question is vital to making good decisions about any software project.  Unfortunately, across the entire software industry, time estimates are almost never accurate.  According to the IEEE, the international engineering standards body:

This is only one of the latest in a long, dismal history of IT projects gone awry… What’s more, the failures are universally unprejudiced: they happen in every country; to large companies and small; in commercial, nonprofit, and governmental organizations; and without regard to status or reputation. The business and societal costs of these failures–in terms of wasted taxpayer and shareholder dollars as well as investments that can’t be made–are now well into the billions of dollars a year.

For one thing, software is one field in which it’s incredibly easy to snowball nontechnical people, who are often the ones providing funding for the project. I’ve seen this story play out way too many times:

It became pretty clear what Dave was really supposed to do: simulate an installation of Knowledge Essentials, run into “technical difficulties,” gather technical requirements and send them back to the developers, stall for a few days while the developers rush to build a prototype, install the prototype while explaining “technical difficulties” are preventing it from being functional, and then work closely with the marketing team to find a good excuse to reschedule the installation for a few weeks later. As soon as Dave realized this, he stormed into his manager’s office and angrily proclaimed: you’re kidding; there is no way I’m going to do this!
Three days later, Dave was on a plane headed to The Netherlands. They say everyone has a price; Dave’s was a 20% bonus and an extra week’s vacation.

As margins shrink, there’s increasing pressure for even the best firms to employ tactics like this.  It’s become so commonplace that programmers joke about how much they can get away with over drinks.

But it’s not just dishonesty that’s the problem.  Even when everyone means well, schedules are almost always worse than worthless.  From Rapid Development:

If Las Vegas sounds too tame for you, software might just be the right gamble. Software projects include a glut of risks that would give Vegas oddsmakers nightmares. The odds of a large project finishing on time are close to zero. The odds of a large project being canceled are an even-money bet (Jones 1991).

In 1988, Peat Marwick found that about 35 percent of 600 firms surveyed had at least one runaway software project (Rothfeder 1988). The damage done by runaway software projects makes the Las Vegas prize fights look as tame as having high tea with the queen. Allstate set out in 1982 to automate all of its office operations. They set a 5-year timetable and an $8 million budget. Six years and $15 million later, Allstate set a new deadline and readjusted its sights on a new budget of $100 million. In 1988, Westpac Banking Corporation decided to redefine its information systems. It set out on a 5-year, $85 million project. Three years later, after spending $150 million with little to show for it, Westpac cut its losses, canceled the project, and eliminated 500 development jobs (Glass 1992). Even Vegas prize fights don’t get this bloody.

The dirty secret about the software industry is that time and cost estimates are worse than guesses.  Nobody wants to talk about it, because they’re afraid everyone will discover that the emperor has no clothes.

Our Estimates

We’ve worked at a lot of firms with the industry-standard terrible software schedules, and when we formed our own company, we knew we could do better.  At DrewCrawfordApps, we pride ourselves on creating the most accurate estimates in the industry.

We use a modified form of Evidence-Based Scheduling, a scientific method that gives us incredibly accurate insights into our software schedules.  We start by breaking every project into small tasks, so small that each one is at most a few hours of work.  Projects may have hundreds or thousands of tasks spanning planning, development, testing, feedback, maintenance, and more.  This granularity makes projects easier to estimate and keeps us honest about the work involved.
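
To make that granularity concrete, here is a minimal sketch of how such a task breakdown might be represented.  The task names, the phases, and the four-hour cap standing in for “a few hours” are all illustrative, not our internal tooling:

```python
from dataclasses import dataclass

MAX_TASK_HOURS = 4  # stands in for "a few hours": anything bigger gets split further

@dataclass
class Task:
    name: str
    phase: str             # e.g. "planning", "development", "testing"
    estimate_hours: float   # the developer's original estimate

def needs_splitting(task: Task) -> bool:
    """A task we can't estimate in a few hours isn't understood well enough yet."""
    return task.estimate_hours > MAX_TASK_HOURS

backlog = [
    Task("Design login screen", "planning", 2.0),
    Task("Implement OAuth flow", "development", 6.0),   # too big: split it
    Task("Write unit tests for OAuth", "testing", 3.0),
]

for t in backlog:
    if needs_splitting(t):
        print(f"Split '{t.name}' ({t.estimate_hours}h) into smaller tasks")
```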

Obviously, the first ingredient of accurate estimates is having experienced developers write them, instead of salespeople.  Our estimates are always written by senior-level developers, and we cross-check them extensively by estimating items independently and comparing the results.  It sounds obvious, but many firms don’t invest the effort in getting it right.

We then estimate each individual task.  But we don’t just trust the estimates we write–we correlate them against previous work.  So if we estimate a task at 30 minutes, we don’t write 30 minutes on the schedule–we write down the average amount of time we’ve historically spent on things that were originally estimated at 30 minutes.  We have a wealth of historical information and complex algorithms that do the number-crunching for us.  For instance, here’s my estimation history:

[Chart: my estimation history]

You can see that my estimates are slightly pessimistic–what I estimate at five hours is usually four and a half.  But I have a few really bad estimates and a few really good estimates.  Our software corrects my estimates, so when I estimate a task at five hours, it writes it down as 4.5 hours.  But it also builds a full probability distribution–there’s a 10% chance it could be as bad as 10 hours, and a 10% chance it could be as good as 30 minutes.
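
As a rough sketch of the arithmetic (with invented numbers, not our real history), both the corrected estimate and the probability distribution can be read directly off the actual hours recorded for past tasks that shared the same original estimate:

```python
from statistics import mean, quantiles

# actual hours spent on past tasks originally estimated at five hours
# (invented numbers for illustration, not our real history)
actuals_for_5h = [4.5, 4.0, 5.0, 3.5, 10.0, 0.5, 4.5, 4.0, 4.0, 4.5]

# the corrected estimate: what "five hours" has historically meant
corrected = mean(actuals_for_5h)

# deciles give a crude empirical distribution: the 1st decile is the
# optimistic tail, the 9th decile the pessimistic one
deciles = quantiles(actuals_for_5h, n=10)

print(f"Schedule as: {corrected:.1f}h")
print(f"10% chance it takes under {deciles[0]:.1f}h")
print(f"10% chance it takes over {deciles[-1]:.1f}h")
```

Raw deciles are the crudest possible version of this; the point is simply that the number that lands on the schedule comes from history rather than from a gut feeling.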

Determining Ship Dates

We take the probability distributions for all of these individual tasks, aggregate them across all of our developers, and forecast our ship dates the way meteorologists forecast the weather (but with a lot more accuracy!).  Here’s a completion graph for one of our projects:

This project is on schedule.  There’s an 80% chance we will ship on time, and a 50% chance that we will ship a week early.  (We’re the only firm that ever ships projects early.)  This is a live report that’s always up to date, telling us exactly when the project will ship.
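
The aggregation step is essentially a Monte Carlo simulation.  Here is a toy version, with invented per-task histories and an assumed six productive hours per developer-day; our production system is more involved, but the principle is the same:

```python
import random

# per-task empirical distributions: actual hours recorded for past tasks that
# shared each task's original estimate (invented numbers for illustration)
task_distributions = [
    [0.75, 0.4, 1.0, 0.5],        # a task estimated at 30 minutes
    [4.5, 4.0, 10.0, 0.5, 5.0],   # a task estimated at 5 hours
    [2.0, 2.5, 1.5, 3.0],         # a task estimated at 2 hours
]

HOURS_PER_DAY = 6  # assumed productive hours per developer per day

def simulate_total_hours() -> float:
    """One Monte Carlo trial: draw a plausible duration for every task."""
    return sum(random.choice(dist) for dist in task_distributions)

trials = sorted(simulate_total_hours() for _ in range(10_000))
p50 = trials[len(trials) // 2]
p80 = trials[int(len(trials) * 0.8)]
print(f"50% confident we ship within {p50 / HOURS_PER_DAY:.1f} working days")
print(f"80% confident we ship within {p80 / HOURS_PER_DAY:.1f} working days")
```

Run over a real task list, the same idea yields a full confidence curve: a projected date for every probability level, recomputed as new actuals come in.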

Developer Timelines

But it’s not just about measuring delivery dates.  Our advanced systems can also predict individual developers’ schedules hour by hour:

Here you can see that on Saturday I’ll probably be working on “core2”, but there’s some chance I could be working on other things.  There’s a 4% chance I will be blocked, waiting for information from a client or one of our developers, which is useful to know.  If that number gets too high, I can take proactive steps to reduce the risk and keep the process flowing smoothly.  We construct color-coded timelines like this for every project and every developer, which helps us predict and address slips well before they occur.
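
As a very rough sketch of how such a timeline could be simulated (the task queue, the duration samples, and the 4% stall probability below are all invented for illustration):

```python
import random
from collections import Counter

# simplified model: my queue of upcoming tasks, each with durations in hours
# sampled from history, plus a chance that the current task is stalled
my_queue = [
    ("core2", [4.5, 4.0, 10.0, 5.0]),
    ("ui fix", [2.0, 1.5, 3.0]),
]
BLOCK_PROBABILITY = 0.04   # assumed chance the task I'm on is stalled when we check
TARGET_HOUR = 6            # "what will I be doing six working hours from now?"

def activity_at(hour: float) -> str:
    """One trial: walk the queue and report what I'm doing at the target hour."""
    elapsed = 0.0
    for name, samples in my_queue:
        duration = random.choice(samples)
        if hour < elapsed + duration:
            # we are inside this task at the target hour; it may be stalled
            return "blocked" if random.random() < BLOCK_PROBABILITY else name
        elapsed += duration
    return "idle"

counts = Counter(activity_at(TARGET_HOUR) for _ in range(10_000))
for activity, n in counts.most_common():
    print(f"{activity}: {n / 10_000:.0%}")
```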

It’s important that we block out time for particular tasks.  It’s possible to ship a 100-hour project in two weeks–in theory.  In practice, we work on many projects at once, and it’s important to balance our time commitments effectively.  We can’t just drop everything and work on a new project, as that wouldn’t be fair to our existing clients.  Tracking individual developer timelines on individual tasks forces us to be honest about how much work is currently on our plate and whether we can commit to new projects while maintaining our high standards for clients and customers.

Data Collection

Our estimation algorithms are only as good as the data we can collect.  If logging time becomes too much of a hassle for our developers, we won’t have accurate historical data, and that means bad scheduling predictions.  Thankfully, we’ve invested in complex proprietary tools that automatically log our time with minimal effort.  Starting work on a project takes a single step, which automatically finds the correct version of the software, downloads it to the developer’s local computer, and starts tracking time.  We’ve made it more difficult not to collect data.  This lets us correlate every individual change to a piece of software with an estimation item, meaning that every time we work on something, we’re collecting data that improves our scheduling forecasts.  As a result, we’ve collected tens of thousands of data points.
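
Our actual tooling is proprietary, but a bare-bones sketch of the “one step” idea, using git and a hypothetical local log file, might look something like this:

```python
import json
import subprocess
import time
from pathlib import Path

LOG_FILE = Path("timelog.jsonl")   # hypothetical local time log, one JSON entry per line

def log_event(task_id: str, event: str) -> None:
    """Append a timestamped start/stop record tied to an estimation item."""
    entry = {"task": task_id, "event": event, "timestamp": time.time()}
    with LOG_FILE.open("a") as f:
        f.write(json.dumps(entry) + "\n")

def start_work(task_id: str, repo_url: str, checkout_dir: str) -> None:
    """One step: fetch the right version of the code and start the clock."""
    if Path(checkout_dir).exists():
        subprocess.run(["git", "-C", checkout_dir, "pull"], check=True)
    else:
        subprocess.run(["git", "clone", repo_url, checkout_dir], check=True)
    log_event(task_id, "start")

def stop_work(task_id: str) -> None:
    """Close out the task so elapsed time can be matched against its estimate."""
    log_event(task_id, "stop")

# Example (hypothetical task id and repository):
# start_work("task-1234", "https://example.com/project.git", "project")
```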

The Result

What does this mean for you, the client?  It means that our estimates are more than just pieces of paper.  Every date, figure, and number has a wealth of expertise and research backing it up, and that’s why we have the most accurate estimates in the industry.  We hit our targets over 90% of the time, whereas the industry average has recently improved to 34%.

Unfortunately, it also means that we lose some work when we claim, accurately, that a project will cost $25,000 and take four months to deliver, while a competitor claims, inaccurately, that it will cost only $10,000 and take two months.  But we think it’s more important that our clients feel comfortable relying on us for accurate information about their software projects than it is for us to win every contract.

If you’re interested in getting an accurate estimate for your software project, get in touch with us today!