Predictive Analytics for Improved Cost Management






Original Post Date: Thursday, October 20, 2011  Check out the article “Why IT Projects May Be Riskier Than You Think”. If you read through the comments you will see that it truly resonates with many in the field. In the article the authors discuss research on 1,471 IT projects (large projects with an average cost of $167 million), comparing budgets and expected performance with actual costs and results. Their results were surprising in that the average overrun was only 27%. It turns out that it isn't the average that requires study but rather the outliers. The study found that ...
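To see why the outliers, and not the average, are what deserve study, here is a small illustration. The numbers below are synthetic, not data from the study; they simply show that when overruns follow a heavy-tailed distribution, the mean can look modest while a meaningful fraction of projects run several times over budget.

```python
# Illustrative only: synthetic overrun data, not the figures from the article.
import numpy as np

rng = np.random.default_rng(42)

# Assume, for illustration, that cost overruns follow a heavy-tailed
# lognormal distribution rather than a symmetric one.
overruns = rng.lognormal(mean=np.log(0.15), sigma=1.2, size=10_000)

print(f"median overrun : {np.median(overruns):7.1%}")
print(f"mean overrun   : {overruns.mean():7.1%}")
print(f"90th percentile: {np.percentile(overruns, 90):7.1%}")
print(f"99th percentile: {np.percentile(overruns, 99):7.1%}")
# The mean stays modest, but the tail contains the budget-busting projects,
# which is exactly why the average alone understates the risk.
```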
Original Post Date: Monday, October 3, 2011 Here’s an interesting article, “Technical Debt as Metaphor for Future Cost”. In it the author discusses the acceptability of using the metaphor of technical debt to facilitate communication between business leaders and the software team when negotiating around the triangle (time, money, scope). The author accepts the metaphor as good “short-hand” for communicating the fact that avoiding the work now does not spare the cost but just rearranges the way the costs are incurred, and often increases the overall cost that must be paid. The ...
Original Post Date: Monday, July 11, 2011 There is a ton of code out there and we’re constantly adding more. Gartner reported in 2009 that there were more than 310 billion lines of code in use worldwide. MS Word grew from 27,000 lines of code in its first version to about 2 million in 2010. The Windows operating system grew from 3 million lines of code in 1992 to 20 million in 1998. You get the point: there’s lots of code out there doing lots of the same things that we may want our software to do. One ...
Original Post Date: Wednesday, June 15, 2011 At the 2011 ISPA Conference, I conducted a half-day workshop, “How To Develop Data-Driven Cost Estimating Relationships in TruePlanning”. The attendees learned how to import their own data into TruePlanning and develop custom Cost Estimating Relationships (CERs). We covered three case studies:
· In the UCAS case study we demonstrated how to build CERs at a higher level to provide a test of reasonableness to the CAPE.
· In the SRDR case study we demonstrated how to develop a CER to estimate SLOC based on historical data and use the results ...
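As a rough illustration of what fitting a custom CER from historical data involves, here is a minimal sketch in plain Python, outside of TruePlanning. The driver (requirement count), the data points, and the power-law form are assumptions made for this example; they are not the UCAS or SRDR data used in the workshop.

```python
# A minimal sketch: fit a power-law CER, KSLOC = a * reqs**b, by least
# squares in log-log space. All data below is hypothetical.
import numpy as np

# Hypothetical historical actuals: (number of requirements, delivered KSLOC)
reqs  = np.array([ 40.0,  85.0, 150.0, 260.0, 400.0, 620.0])
ksloc = np.array([ 11.0,  22.0,  41.0,  68.0, 105.0, 160.0])

# ln(KSLOC) = ln(a) + b * ln(reqs) linearizes the power law.
b, ln_a = np.polyfit(np.log(reqs), np.log(ksloc), deg=1)
a = np.exp(ln_a)
print(f"CER: KSLOC ~ {a:.2f} * reqs^{b:.2f}")

# Apply the CER to a new program with a known requirement count.
print(f"Predicted size for 300 requirements: {a * 300**b:.0f} KSLOC")
```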
Original Post Date: Friday, June 10, 2011 I’m on my way home from the ISPA/SCEA (International Society of Parametric Analysts / Society of Cost Estimating and Analysis) Conference held in Albuquerque this week. Attendance was very good (the second best in the conference’s history) and the content seemed especially good this week. I attended lots of good talks on topics ranging from SEPM (Systems Engineering / Project Management) cost estimating, Joint Confidence Levels, software estimating, and affordability to Agile software development and estimating for Enterprise Resource Planning systems. Of course, just because the topics are good and well presented doesn’t mean I have ...
Original Post Date: Wednesday, May 25, 2011  Going to ISPA/SCEA in New Mexico? If so, join us for a workshop on data-driven cost estimating. Description: Building transparency and traceability into your estimating process leads to more defensible estimates. This hands-on workshop demonstrates how historical data is transformed into predictive models. You will learn how your organization’s data can be synthesized into custom models that can be employed in support of third-party models within a single analytical framework. Participants will learn:
(1) To develop system-level estimating relationships to provide a test of reasonableness and historical cross-check to proposed estimates (see the sketch below).
(2) To develop ...
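Here is a minimal sketch of the kind of historical cross-check described in item (1): compare a proposed estimate against the prediction of a previously fitted system-level CER and flag it when it falls well outside the historical scatter. The CER parameters, the residual sigma, and the two-sigma threshold are illustrative assumptions, not workshop material.

```python
# A minimal sketch of a reasonableness cross-check against a fitted CER.
import numpy as np

def cross_check(proposed_cost, size, a, b, log_sigma, k=2.0):
    """Return (passes, predicted): does the proposal lie within k residual
    standard deviations (in log space) of the CER prediction a * size**b?"""
    predicted = a * size**b
    z = abs(np.log(proposed_cost) - np.log(predicted)) / log_sigma
    return z <= k, predicted

# Hypothetical CER parameters and log-space residual sigma from a prior fit.
a, b, log_sigma = 30.0, 0.95, 0.25

passes, predicted = cross_check(proposed_cost=4200.0, size=90.0,
                                a=a, b=b, log_sigma=log_sigma)
print(f"CER predicts ~ ${predicted:,.0f}K; proposal passes cross-check: {passes}")
```

In this example the proposal sits well above the historical band, so it would be flagged for further scrutiny rather than rejected outright.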
Original Post Date: Wednesday, March 16, 2011 ...wear the worst shoes. The cobbler was a master at his craft; he was just too tired to practice it when he got home from the shop.  Sound familiar? A disciplined approach to understanding (functional) requirements as well as analogous projects (with actuals) is our not-so-secret sauce. Why run the risk of creeping back up our career learning curve? There’s already enough scope creep to keep us busy. Plus, for you management types charged with prospecting, a consistent approach towards estimation is a great way to connect with people who've felt the pain of being the cobbler's kids. I recently reconnected ...
Original Post Date: Thursday, February 17, 2011 My June blog entry suggested the use of parametrics in real-options valuation. This month, I’d like to offer the generalized use of our type of modeling in valuing tangible assets. Typically, fundamental analysis evaluates the intrinsic value of securities. I won’t attempt to compete with Warren Buffett here. But it is certainly the case that the value of a company, or of a portfolio of securities reflecting many companies, is based in part on the market value of its product assets and their potential for future earnings, as well as on other objective and subjective considerations. In parametric estimation, we take a top-down ...
Original Post Date: Monday, November 15, 2010 Last week I attended the 25th International Forum on COCOMO and Systems/Software Cost Modeling. I attended for several reasons. First, I was invited to participate on a panel whose topic was “25 Years of Software Estimation: Lessons Learned, Challenges and Opportunities”. Second, I have attended in the past, and while it’s generally a small group, as such conferences go, I always come away impressed by the fact that so many smart people end up in one room, and this year was no different. But I digress; I really wanted to share ...
Original Post Date: Tuesday, November 2, 2010  After some recent meetings with clients I am sensing some confusion about how to estimate software reuse. I think part of the problem is in the definition of reuse, so let's start with a definition and then address the estimating issue. Software reuse is defined as “the use of existing software, or software knowledge, to build new software.” This definition comes from Wikipedia. From a software cost estimating perspective, that definition is part of the problem. The definition should read: “Use of existing software, with no changes, for operation in the new software program.” If the existing software is going to be changed, ...
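One common way to make that distinction concrete in an estimate, offered here as a sketch rather than as the author's prescribed method, is a simplified COCOMO-style equivalent-SLOC calculation: verbatim reuse contributes little or no equivalent new code, while code that will be modified is converted back into "new" lines in proportion to how much of its design, code, and integration must change. The percentages below are illustrative.

```python
# Simplified COCOMO-style adaptation: ESLOC = ASLOC * AAF / 100, where
# AAF = 0.4*DM + 0.3*CM + 0.3*IM (percent of design, code, and integration
# modified). Other adjustment factors (software understanding, assessment
# and assimilation, unfamiliarity) are omitted for brevity.
def equivalent_sloc(adapted_sloc, pct_design_mod, pct_code_mod, pct_integ_mod):
    aaf = 0.4 * pct_design_mod + 0.3 * pct_code_mod + 0.3 * pct_integ_mod
    return adapted_sloc * aaf / 100.0

# True reuse: no design, code, or integration changes.
print(equivalent_sloc(50_000, 0, 0, 0))     # 0.0 equivalent new SLOC
# Modified "reuse": the more it changes, the closer it gets to new code.
print(equivalent_sloc(50_000, 20, 30, 40))  # 14500.0 equivalent new SLOC
```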