Predictive Analytics for Improved Cost Management
Original Post Date: Wednesday, March 16, 2011 ...wear the worst shoes. The cobbler was a master at his craft; he was just too tired to practice it when he got home from the shop.  Sound familiar? A disciplined approach to understanding (functional) requirements as well as analogous projects (with actuals) is our not-so-secret sauce. Why run the risk of creeping back up our career learning curve? There’s already enough scope creep to keep us busy. Plus, for you management types charged with prospecting, a consistent approach towards estimation is a great way to connect with people who've felt the pain of being the cobbler's kids. I recently reconnected ...
Original Post Date: Wednesday, March 16, 2011 In Parametrics is Free, I acknowledged receiving (too late) “you should’ve known to ask that” over the years. Quality control after the fact is fine, but it’s better and cheaper to take a systematic approach to quality assurance as part of your estimating process. The sheer volume of what we model can often keep us so close to the details that we are unable to step back and put our QA hat on for a sanity check. Enter Quality! On a very large project, our team has introduced a few regular cross-checks beyond the typical math check-sums. A round table peer ...
Original Post Date: Wednesday, March 16, 2011 I’m not a golfer. But we’ve all heard a golfer say “that’s why I play” after hitting a shot and feeling like it all came together. What “it” is, in terms of mechanics and timing, I’m not really sure. In our own world of parametrics, it’s the feeling of adding value in that golden moment of facilitating decisions and forward momentum. We wear many hats: estimating, consulting, systems engineering...even cost accounting. Building an AoA, ICE, or ROM is where the rubber meets the road with regard to configurations and assumptions. Not too long ago I was in a discussion with a number of Subject Matter Experts ...
Original Post Date: Tuesday, March 15, 2011 In my blog last week on Work Breakdown Structures, we reviewed the subtleties of using the [object] tag to your advantage in creating different sorts and roll-up subtotals. As a follow-up, I’d like to drill down a bit on the initial step of using the “copy grid” exports. Each row number is unique, thus creating an identifying key for the VLOOKUP function in Excel. Since all object × activity instances are allocated 100% to one of the three phases (with very rare exception), these row keys allow you to sort and re-group outputs while maintaining ...
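
For those who want to see the mechanics, here is a minimal Python sketch of the keyed re-join that VLOOKUP performs. The row keys and field names (phase, cost) are illustrative assumptions, not the actual “copy grid” headers:

    # A sketch of the keyed lookup VLOOKUP performs; data is made up.
    # Each unique row key from the "copy grid" export identifies exactly
    # one object x activity instance, so the lookup is unambiguous.
    export = {
        1: {"phase": "Development", "cost": 120.0},
        2: {"phase": "Production", "cost": 310.0},
        3: {"phase": "Support", "cost": 45.0},
    }

    # A re-sorted or re-grouped view can carry only the row keys...
    regrouped_keys = [3, 1, 2]

    # ...and pull each full record back by key, just as
    # =VLOOKUP(key, grid, column, FALSE) would in Excel.
    for key in regrouped_keys:
        record = export[key]
        print(key, record["phase"], record["cost"])

Because the 100%-to-one-phase allocation guarantees one record per key, any re-sorted or re-grouped output can always be traced back to its source row.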
Original Post Date: Thursday, March 10, 2011 My previous blog discussed a “Should Cost” methodology used by PRICE Systems to complete an analysis. In the article I included a chart depicting calibration results for manufacturing complexities for each weapon system (X-axis). Manufacturing complexities are a major cost driver within the model. This parameter can be derived from model knowledge tables, generators, or from calibration. Many times the calibrated results are simply averaged and used for predicting cost for the new system. This assumes that the new system is very similar in technology and performance to the systems used for calibration. In general this is not the ...
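
As a purely illustrative sketch in Python, here is the difference between the simple average the post describes and one common alternative, a similarity-weighted average that favors the closest analogs. The complexity values and similarity weights are invented for the example and are not PRICE model outputs:

    # Calibrated manufacturing complexities, one per analog weapon system
    # (values invented for illustration).
    complexities = [5.2, 5.8, 6.1, 5.5]
    simple_avg = sum(complexities) / len(complexities)

    # Hypothetical similarity scores (0..1) of each analog to the new
    # system; weighting is one common way to avoid assuming every analog
    # is equally representative of the new design.
    weights = [0.9, 0.4, 0.2, 0.8]
    weighted_avg = sum(w * c for w, c in zip(weights, complexities)) / sum(weights)

    print(f"simple average:   {simple_avg:.2f}")   # 5.65
    print(f"weighted average: {weighted_avg:.2f}")  # 5.49

The gap between the two results shows the cost of the "all analogs are alike" assumption the post warns about.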
Original Post Date: Tuesday, March 8, 2011 How many passengers does the world's largest jetliner, the Airbus A380, hold?  Submit your estimate in the comments section!
Original Post Date: Monday, March 7, 2011 Based on your experience, does winning an opportunity in the DoD come down to how well the proposal is written? Or are there other contributors, like the content of the proposal? The type of analysis described in the proposal? The estimation methodology? How well the cost realism is justified? Any insight you have would be great.
Original Post Date: Friday, March 4, 2011 I consistently run into this idea of data-driven estimating. Yet there is no clear explanation of the concept. I am not trying to provide one here; rather, I am interested in what is at the root of this growing movement. My take is that it is an attempt to scratch an itch. But what’s the itch? I believe it is related to my earlier post (Accuracy is Risky Business). In the struggle to answer the accuracy question, people have decided that understanding the data used in the estimating process is key ...
Original Post Date: Tuesday, March 1, 2011 The concept of the fuel cell was first published in 1838 by Christian Friedrich Schönbein. Based on this publication, Sir William Grove invented the precursor of the fuel cell in 1839. The Grove Cell created current by applying two acids to zinc and platinum electrodes separated by a porous ceramic pot. In 1842 Grove developed the first actual fuel cell, which produced electricity with hydrogen and oxygen, much like many fuel cells in use today. Fuel cells remained an intellectual curiosity until the 1960s, when the US space program identified a requirement for ...
Original Post Date: Monday, February 28, 2011 Don't reinvent the wheel. It's a waste of time and effort. All too often I see organizations establishing measurement programs or new software estimation initiatives, and they want to build everything from the ground up. Mistake, mistake, mistake... People have gone before you. Learn from them. Take their ideas and go forward from there. In the past year, I have architected the implementation of our software cost estimation tools at two large federal agencies and two DoD programs. Teaching people how to estimate is easy. Teaching them how to find the data to develop estimates is ...