Predictive Analytics for Improved Cost Management



Blog



Original Post Date: Tuesday, October 2, 2012 We’re building a tool that allows you to quickly and easily map costs in your TruePlanning® projects to any custom Work Breakdown Structure (WBS), including templates for MIL-STD-881C.  By mapping TruePlanning costs to fit your point of view, you can make use of your organization’s existing research and ensure that your cost estimates are complete, realistic, and easily compared apples-to-apples to other projects.  This is great for Analyses of Alternatives (AoAs) and for analyzing where and why costs differ between various solutions (not to mention that 881C mapping is a required deliverable for major ...
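The mapping idea can be sketched in a few lines. A minimal illustration (invented cost objects and WBS labels; this is not TruePlanning’s actual API or the 881C template itself): each cost object is tagged with the custom WBS element it rolls up to, and costs are summed per element.

```python
# Minimal sketch: roll cost-object estimates up to a custom WBS.
# Names and numbers are invented for illustration.
from collections import defaultdict

# (cost object, estimated cost) pairs, as a cost model might produce
cost_objects = [
    ("Guidance Software", 1_200_000),
    ("Airframe Structure", 3_400_000),
    ("System Test Lab", 800_000),
]

# User-defined mapping from cost objects to MIL-STD-881C-style elements
wbs_map = {
    "Guidance Software": "1.1 Air Vehicle / Guidance",
    "Airframe Structure": "1.1 Air Vehicle / Airframe",
    "System Test Lab": "1.4 System Test and Evaluation",
}

wbs_totals = defaultdict(float)
for name, cost in cost_objects:
    wbs_totals[wbs_map[name]] += cost

for element, total in sorted(wbs_totals.items()):
    print(f"{element}: ${total:,.0f}")
```

Once every cost object maps to exactly one WBS element, two estimates built from different cost models can be compared element by element.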
Original Post Date: Thursday, September 27, 2012 I am frequently questioned by clients and prospects about the applicability of PRICE’s parametric software estimation model to agile software development projects.  There are several ways one could respond to this.  My first thought is that if a shop is truly agile, they don’t need an estimation tool.  They know their development team’s velocity because agile teams are committed to measurement.  They also either know when they need to make a delivery – in which case whatever amount of software they’ve built by that point will be released.  Alternatively, they may know a ...
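The velocity point is easy to make concrete. A back-of-the-envelope sketch (made-up numbers, not PRICE’s parametric model): with a measured velocity, an agile team can forecast either the scope deliverable by a fixed date or the date for a fixed scope.

```python
# Illustrative velocity arithmetic; all numbers are invented.
import math

velocity = 28        # story points completed per sprint (measured)
backlog = 350        # story points remaining
sprint_weeks = 2     # sprint length

# Fixed date: scope deliverable in the next 10 sprints
scope_by_date = velocity * 10                   # 280 points

# Fixed scope: sprints (and weeks) needed to finish the backlog
sprints_needed = math.ceil(backlog / velocity)  # 13 sprints
weeks_needed = sprints_needed * sprint_weeks    # 26 weeks

print(scope_by_date, sprints_needed, weeks_needed)
```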
Original Post Date: Monday, September 24, 2012 The purpose of this blog is to describe the role Value Engineering plays within the affordability process. The blog is not a step-by-step “How To Conduct or Execute” guide to Value Engineering (VE) but rather a discussion of the context, input, setup, execution hints, and output of Value Engineering in support of conducting affordability analysis and management. As such, it is important to understand the concept of affordability within the Systems Engineering paradigm. This blog is designed to provide insights, lessons learned, and suggestions for using Value Engineering in the affordability process. In ...
Original Post Date: Thursday, August 23, 2012 We’ve bought access to IHS Haystack, which links us to huge amounts of data on government parts and logistics, including electronics running the gamut of technologies, functions, and types of equipment.  The tool combines data from more than 70 military and commercial databases, and includes detailed data on costs, procurement histories, and a wide range of technical characteristics – perfect for building and validating cost estimating relationships (CERs). This data will be very useful for our TP 2013 Electronics Update.  All of this data can be built into a knowledge base and made available to TruePlanning users.  With this ...
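For readers new to the term, a CER is a statistical relationship fit over historical data like this. A toy sketch of the usual form (fabricated data points, not Haystack’s; cost = a · weight^b fit in log space):

```python
# Toy power-law CER fit: cost = a * weight**b.
# The five data points are fabricated for illustration.
import numpy as np

weight = np.array([10, 25, 40, 80, 150])        # e.g., unit weight in kg
cost = np.array([5.1, 11.8, 17.0, 30.5, 52.0])  # e.g., unit cost in $K

# log(cost) = log(a) + b * log(weight) is linear, so fit with polyfit
b, log_a = np.polyfit(np.log(weight), np.log(cost), 1)
a = np.exp(log_a)

print(f"CER: cost = {a:.2f} * weight^{b:.2f}")
print(f"Predicted cost at 60 kg: {a * 60**b:.1f} $K")
```

Validating the CER then amounts to holding out some of the historical records and checking the prediction error on them.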
Original Post Date: Tuesday, July 3, 2012  Introduction A cost estimation model needs to be fed data, and it is only as good as the data used to create it. Frequently the data needed to ‘feed’ the cost model comes from a variety of sources, including engineers, subject matter experts, and other individuals not directly building the cost model. Just asking these experts to stick their finger in the air and take a guess isn’t always the best approach. Using the COM-enabled Data Input Form Excel solution that comes with TruePlanning can help users obtain the data needed to complete ...
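The general shape of that workflow can be sketched on Windows with the pywin32 package (the workbook path, sheet name, and cell layout below are hypothetical placeholders, not the actual TruePlanning form): drive Excel over COM, let the expert fill in the form, then read the values back into the estimate.

```python
# Rough sketch of the Excel-over-COM pattern (Windows, pywin32).
# File path, sheet name, and cells are hypothetical placeholders.
import win32com.client

excel = win32com.client.Dispatch("Excel.Application")
wb = excel.Workbooks.Open(r"C:\estimates\data_input_form.xlsx")
sheet = wb.Worksheets("Inputs")

# Read values the subject matter expert typed into the form
new_code_sloc = sheet.Range("B2").Value
reuse_percent = sheet.Range("B3").Value

wb.Close(SaveChanges=False)
excel.Quit()

print(new_code_sloc, reuse_percent)
```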
Original Post Date: Monday, June 18, 2012 This week I’m attending the Better Software Conference in Vegas.   I just attended a great keynote given by Patrick Copeland of Google.  The topic was innovation.  He talked about how innovators beat ideas, pretotypes beat productypes, and data beats opinions.  These sentiments are all part of the pretotyping manifesto. He started with the truth that most new products and services fail and proposed that while this is not unexpected, there is a good way to fail and a bad way to fail.  The good way to fail is to fail fast.  ...
Original Post Date: Tuesday, June 5, 2012 Ever wonder which programming languages are the most productive?  I recently did a little research into this topic using the International Software Benchmarking Standards Group (ISBSG) database. The database contains over 5000 data points with size and effort data for projects from a wide variety of industries, applications, and countries.  Of course, not all 5000 data points were suitable for my investigation.  The software size is measured using functional size metrics, but the database accepts projects that use various counting methods.  I narrowed my search to projects that used the International Function Points ...
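Once the subset is filtered, the comparison itself is simple. A sketch of the computation (invented sample rows standing in for ISBSG records) using project delivery rate, hours per function point, where lower means more productive:

```python
# Hours per IFPUG function point by language; rows are invented
# stand-ins for filtered ISBSG records.
from collections import defaultdict
from statistics import median

projects = [  # (language, effort hours, function points)
    ("Java", 4200, 310), ("Java", 6100, 500),
    ("COBOL", 9800, 620), ("COBOL", 7400, 410),
    ("C#", 3900, 350),
]

rates = defaultdict(list)
for language, effort_hours, function_points in projects:
    rates[language].append(effort_hours / function_points)

for language, values in sorted(rates.items()):
    print(f"{language}: median {median(values):.1f} hours per FP")
```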
Original Post Date: Friday, March 23, 2012 Recently I have been playing around with the International Software Benchmarking Standards Group (ISBSG) database for Development and Enhancement projects.  And in the interest of full disclosure, I admit that I am more than a little excited to have close to 6000 data points at my fingertips.  I will further admit that there’s something quite daunting about having this much data: where to start, what to look for, and how best to use it to offer some useful guidance for software cost estimation.  For those of you not familiar ...
Original Post Date: Thursday, February 9, 2012  Model Driven Engineering is a software development methodology focused on creating domain models that abstract the business knowledge and processes of an application domain.  Domain models allow the engineer to pursue a solution to a business problem without considering the eventual platform and implementation technology.  Model Driven Development is a paradigm within Model Driven Engineering that uses models as primary artifacts of the development process, using automation to go from models to actual implementation.  Model Driven Architecture is an approach for developing software within the Model Driven Development paradigm.  It was ...
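The “models to actual implementation” step can be pictured with a toy generator. This is only a sketch of the idea (an invented model format, not any real MDA toolchain): a platform-independent domain model is transformed automatically into implementation code.

```python
# Toy model-to-code transformation in the MDD spirit.
# The domain-model format is invented for illustration.
domain_model = {
    "entity": "Customer",
    "attributes": [("name", "str"), ("credit_limit", "float")],
}

def generate_class(model: dict) -> str:
    """Emit a Python class from a platform-independent domain model."""
    params = ", ".join(f"{n}: {t}" for n, t in model["attributes"])
    lines = [f"class {model['entity']}:",
             f"    def __init__(self, {params}):"]
    lines += [f"        self.{n} = {n}" for n, _ in model["attributes"]]
    return "\n".join(lines)

print(generate_class(domain_model))
```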
Original Post Date: Monday, December 12, 2011 Check out this paper, “The Economics of Community Open Source Software Projects: An Empirical Analysis of Maintenance Effort.”  In it, the authors hypothesize that Open Source practices increase the quality of the software that gets produced and subsequently lead to code that is less costly to maintain.  Low-quality code must be refactored more frequently than high-quality code, and there is substantial evidence that maintenance interventions tend to lead to even more degradation of code quality.  So not only are low-quality applications more expensive to maintain, the unit ...