Predictive Analytics for Improved Cost Management



Blog



Original Post Date: Tuesday, October 2, 2012 This past year PRICE Systems has entered into a partnership with the International Software Benchmarking Standards Group (ISBSG).  As part of this partnership we have a corporate subscription to both of their databases – the Development and Enhancement Database and the Maintenance and Support Database.  We can use these for analysis and to develop metrics that will help TruePlanning users become better software estimators.  The ISBSG is one of the oldest and most trusted sources of software project data.  They are a not-for-profit organization dedicated to improving software measurement at an international ...
Original Post Date: Tuesday, October 2, 2012 We’re building a tool that quickly and easily allows you to map costs in your TruePlanning(R) projects to any custom Work Breakdown Structure (WBS), including templates for MIL-STD-881C.  By mapping TruePlanning costs to fit your point of view, you can make use of your organization’s existing research and ensure that your cost estimates are complete, realistic, and can be compared apples-to-apples to other projects.  This is great for Analyses of Alternatives (AoAs) and for analyzing where and why costs differ between various solutions (not to mention that 881C mapping is a required deliverable for major ...
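As a minimal sketch of the mapping idea (not the tool itself), the example below tags each cost item with a target WBS element and rolls costs up by element; the cost items and the 881C-style element numbers are purely illustrative.

```python
# Illustrative sketch: roll estimated costs up into a custom WBS view.
# The cost items and 881C-style element numbers are hypothetical examples.
from collections import defaultdict

cost_items = [
    {"name": "Airframe structure",  "cost": 1_200_000, "wbs": "1.1"},
    {"name": "Avionics software",   "cost":   850_000, "wbs": "1.2"},
    {"name": "Systems engineering", "cost":   400_000, "wbs": "1.5"},
    {"name": "Program management",  "cost":   300_000, "wbs": "1.6"},
]

rollup = defaultdict(float)
for item in cost_items:
    rollup[item["wbs"]] += item["cost"]

for element, total in sorted(rollup.items()):
    print(f"WBS {element}: ${total:,.0f}")
```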
Original Post Date: Thursday, September 27, 2012 I am frequently questioned by clients and prospects about the applicability of PRICE’s parametric software estimation model to agile software development projects.  There are several ways one could respond to this.  My first thought is that if a shop is truly agile, they don’t need an estimation tool.  They know their development team’s velocity because agile teams are committed to measurement.  They also either know when they need to make a delivery – in which case whatever amount of software they’ve built by that point will be released.  Alternatively, they may know a ...
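To make that "velocity plus a fixed delivery date" reasoning concrete, here is a minimal sketch, with entirely hypothetical numbers, of how a team that knows its velocity can estimate how much scope fits before a release date.

```python
# Illustrative sketch (hypothetical numbers): given a measured team velocity,
# estimate how many story points can be delivered before a fixed release date.
from datetime import date

velocity_per_sprint = 32          # story points per sprint, from past sprints
sprint_length_days = 14
release_date = date(2012, 12, 14)
today = date(2012, 9, 27)

sprints_remaining = (release_date - today).days // sprint_length_days
deliverable_points = sprints_remaining * velocity_per_sprint
print(f"{sprints_remaining} sprints remain; roughly {deliverable_points} points can ship.")
```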
Original Post Date: Monday, September 24, 2012 The purpose of this blog is to describe the role Value Engineering plays within the affordability process. It is not a step-by-step guide on how to conduct or execute Value Engineering (VE); rather, it is a discussion of the context, inputs, setup, execution hints, and outputs of Value Engineering in support of affordability analysis and management. As such, it is important to understand the concept of affordability within the Systems Engineering paradigm. This blog is designed to provide insights, lessons learned, and suggestions for using Value Engineering in the affordability process. In ...
Original Post Date: Tuesday, July 3, 2012  Introduction A cost estimation model needs to be fed data, and it is only as good as the data used to create it. Frequently the data needed to ‘feed’ the cost model comes from a variety of sources, including engineers, subject matter experts, or other individuals not directly building the cost model. Just asking these experts to stick their finger in the air and take a guess isn’t always the best approach. Using the COM-enabled Data Input Form Excel solution that comes with TruePlanning can help users obtain the data needed to complete ...
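As a generic illustration of the idea, and not PRICE’s COM-enabled solution itself, the sketch below reads expert-supplied inputs from a simple Excel input form with openpyxl so the values can feed a cost model; the workbook name, sheet name, and cell locations are hypothetical.

```python
# Illustrative sketch only: read expert-supplied inputs from an Excel data
# input form. Workbook name, sheet name, and cell locations are hypothetical.
from openpyxl import load_workbook

wb = load_workbook("data_input_form.xlsx", data_only=True)
ws = wb["Inputs"]

inputs = {
    "new_code_size_sloc": ws["B2"].value,
    "reuse_percentage":   ws["B3"].value,
    "team_experience":    ws["B4"].value,
}
print(inputs)  # these values would then be passed into the cost model
```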
Original Post Date: Tuesday, July 3, 2012 Introduction @RISK and Crystal Ball are two Excel-based applications that allow users to perform uncertainty, sensitivity, or risk analysis on data contained in Excel spreadsheets.  The analysis can be performed using various techniques, including Monte Carlo and Latin Hypercube sampling.  Many TruePlanning users are interested in performing this type of analysis on the models they create in TruePlanning, or may even have requirements to perform this type of analysis on their estimates.  In response to this desire, PRICE Systems L.L.C. has created two Excel-based solutions that allow users to easily leverage the ...
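For readers unfamiliar with this kind of analysis, here is a bare-bones Monte Carlo sketch in Python (not the @RISK or Crystal Ball integrations themselves); the cost relationship and distribution parameters are hypothetical.

```python
# Illustrative Monte Carlo sketch: propagate input uncertainty to a cost output.
# The cost relationship and distribution parameters are hypothetical.
import random

def simulate_cost():
    size_ksloc = random.triangular(40, 90, 60)        # low, high, mode (KSLOC)
    hours_per_sloc = random.triangular(1.5, 3.5, 2.5)  # effort per line of code
    labor_rate = 120                                    # dollars per hour
    return size_ksloc * 1000 * hours_per_sloc * labor_rate / 1000  # cost in $K

samples = sorted(simulate_cost() for _ in range(10_000))
for p in (10, 50, 90):
    print(f"P{p}: ${samples[int(len(samples) * p / 100)]:,.0f}K")
```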
Original Post Date: Monday, April 2, 2012 In my previous blog, I introduced the relationship between three “breakdown structures” commonly used in project management, and how cost estimates are linked to them and make them more useful.  In this blog, I’ll dig further into these relationships and explain their impact on my Total Ownership Cost (TOC) solution in TruePlanning.  Let’s use an example of an aircraft being built for the Army.  In this hypothetical example, the prime contractor is Boeing, and they have different departments working on various parts of the aircraft.  Department 10 is responsible for the wings and ...
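A quick sketch of the idea, with hypothetical departments, product elements, and costs: the same cost records can be rolled up by the organization doing the work or by the product element the work supports.

```python
# Illustrative sketch: one set of cost records viewed by organization (OBS)
# and by product element (WBS). All names and amounts are hypothetical.
records = [
    {"dept": "Dept 10", "element": "Wings",    "cost": 500_000},
    {"dept": "Dept 10", "element": "Fuselage", "cost": 350_000},
    {"dept": "Dept 20", "element": "Avionics", "cost": 700_000},
    {"dept": "Dept 20", "element": "Wings",    "cost": 150_000},
]

def rollup(records, key):
    totals = {}
    for r in records:
        totals[r[key]] = totals.get(r[key], 0) + r["cost"]
    return totals

print("By department:", rollup(records, "dept"))
print("By product element:", rollup(records, "element"))
```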
Original Post Date: Friday, March 23, 2012 Recently I have been playing around with the International Software Benchmarking Standards Group (ISBSG) database for Development and Enhancement projects.  And in the interest of full disclosure, I admit that I am more than a little excited to have close to 6,000 data points at my fingertips.  I will further admit that there’s something quite daunting about having this much data: where to start, what to look for, and how best to use it to offer useful guidance for software cost estimation.  For those of you not familiar ...
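As one example of how a first look at a dataset like this might start, the sketch below loads an ISBSG-style extract and summarizes a delivery-rate metric by development platform; the file name and column names are placeholders rather than the actual ISBSG schema.

```python
# Illustrative sketch: a first look at an ISBSG-style extract with pandas.
# File name and column names are placeholders, not the actual ISBSG schema.
import pandas as pd

df = pd.read_csv("isbsg_extract.csv")

# Delivery rate: effort hours per function point, a common productivity metric.
df["delivery_rate"] = df["normalised_effort_hours"] / df["functional_size_fp"]

summary = df.groupby("development_platform")["delivery_rate"].describe()
print(summary[["count", "mean", "50%"]])
```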
Original Post Date: Monday, December 12, 2011 Check out this paper, “The Economics of Community Open Source Software Projects: An Empirical Analysis of Maintenance Effort.”  In it the authors hypothesize that open source practices increase the quality of the software that gets produced and subsequently lead to code that is less costly to maintain.  Low-quality code must be refactored more frequently than high-quality code, and there is substantial evidence that maintenance interventions tend to further degrade the quality of the code.  So not only are low-quality applications more expensive to maintain, the unit ...