
I watched a presentation by Joel Semeniuck. As his LinkedIn profile states, he is currently "Executive VP at Telerik". Anyway, this post is not about Joel himself but about estimating how long it takes to complete a given software project.
I was thinking about this over and over again.
Actually, each time I gave an estimate while I was at Telerik, my team lead used to double it in order to get a somewhat accurate one.
And I was pretty ashamed in the beginning. I always push as hard as possible in order to be a great software developer: I stay up nights, I work weekends, whatever it takes. And still I can hardly meet my deadlines.
(Note: I wasn't working nights / weekends @Telerik; this is a generalization of my whole life as a software developer.)
Why would that be? I try so hard and still get no result.
(To be honest, there are lots of cases where I was able to meet deadlines, but they were about things I already knew how to do step by step, so they don't count. Or do they?)
And it's that last parenthesized sentence that got me thinking.
And I came up with yet another software development law. Maybe it only applies to me, in which case it's not actually a law covering the whole software development industry, but let me first state it and then discuss it:
"You can estimate a software project using one of two algorithms: the one all Windows versions prior to Vista used to estimate file copy time, or the one all Windows versions from Vista onward use to estimate how long it will take to copy a file."
First of all, what's the difference?
In Windows versions prior to Vista, the time Explorer reported for copying a file or files started out somewhat random, then fluctuated as the copy progressed. Ever seen it start at 2 minutes and jump to 10 minutes? Well, that's what I'm talking about.
In Windows versions starting from Vista, the system is much more accurate, except for one thing: it won't tell you how long it will take to calculate how long the copy will take. So you first wait for Windows to do some estimation by chunking the data it has to deal with. Only then does it become more accurate in telling you how long you'll wait.
(Note: I'm probably missing a lot of detail from their actual algorithms, but this is the impression they leave on an end user.)
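The two behaviors can be sketched in a few lines of code. To be clear, this is a toy illustration of the end-user impression described above, not Microsoft's actual algorithm; the function names and numbers are entirely my own.

```python
# Toy contrast of the two estimation styles (my simplification,
# not Microsoft's real file-copy estimator).

def naive_estimate(total_bytes, sample_bytes, sample_seconds):
    """Pre-Vista style: extrapolate the whole job from whatever single
    sample you have so far. One noisy sample -> a jumpy estimate."""
    speed = sample_bytes / sample_seconds
    return total_bytes / speed

def chunked_estimate(chunks):
    """Vista style: first walk all the data (which itself takes time),
    then estimate from throughput averaged over many measured chunks.
    `chunks` is a list of (size_in_bytes, seconds_taken) pairs."""
    total_bytes = sum(size for size, _ in chunks)
    speeds = [size / seconds for size, seconds in chunks]
    avg_speed = sum(speeds) / len(speeds)
    return total_bytes / avg_speed

# A slow first sample makes the naive estimate wildly pessimistic...
print(naive_estimate(1000, 10, 5))  # 500.0 seconds
# ...while averaging several measured chunks smooths the guess out.
print(chunked_estimate([(10, 5), (500, 10), (490, 10)]))
```

The point of the toy: the naive estimator's answer is hostage to one sample, while the chunked one pays an upfront measurement cost to buy stability.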
So basically, if you estimate like the older Windows versions, you look at one large chunk of work and you can only estimate that chunk roughly.
If you estimate the way Vista and newer Windows versions work, your estimate will be much more accurate, because you'll have chunked all the work upfront, leaving much less room for error. In the ideal case, you split the big chunk into really, really small tasks you know you can do in an hour.
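A sketch of what this "chunk upfront" style could look like for a project estimate. The task list, the hour figures, and the 1.5x safety factor are entirely my own illustrative assumptions, not anyone's prescribed method:

```python
# Toy "new Windows"-style project estimate: break the work into tasks
# small enough to estimate in single hours, then sum with a buffer.
# All names and numbers here are invented for illustration.

def project_estimate(task_hours, buffer=1.5):
    """Sum many small, well-understood estimates, then apply a safety
    factor for the surprises that even tiny tasks hide."""
    return sum(task_hours) * buffer

tasks = {
    "design login form": 2,
    "wire up validation": 1,
    "write unit tests": 3,
}
print(project_estimate(tasks.values()))  # 6 raw hours -> 9.0 with buffer
```

The catch, of course, is that producing that task list is itself work you have to finish before you can answer anyone.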
The bad thing about the second approach is that when asked "What's your estimate on this?" you will have to say, "Gee, I don't know. I'll first need time to estimate it; then I'll be more accurate." "How much time do you need to estimate it?" "Gee, I don't know. You'll have to wait for me."
So to summarize:
Estimating like old Windows: "It will take 2 months, and it may go up to 10 months. I may be able to complete it in 2 weeks, though."
Estimating like new Windows: "Come back later. Or better, I'll call you when I have an estimate. I know you have a decision to make, but I can't do anything about it yet."
I'll leave it to you to decide which one to use :). Maybe a mix of both?
P.S. If you think about it, this applies to known projects. Add some third-party black-box integration and your estimate will fluctuate the same way Windows fluctuates when it downloads a file: it tries to account for a variable that changes randomly, namely the speed of your internet connection. For such projects you can go from 1 hour to 1 year, or even work for 1 year and realize the project cannot be completed. (Well, guess what: although Windows states it will get your file in 1 hour, it doesn't tell you what will happen if you decide to disable your network adapter.)