• Predictive Analytics for Improved Cost Management  

Blog

Original Post Date: Monday, September 24, 2012 The purpose of this blog is to describe the role Value Engineering plays within the affordability process. The blog is not a step-by-step “How To Conduct or Execute” Value Engineering (VE) guide; rather, it is a discussion of the context, inputs, setup, execution hints, and outputs of Value Engineering in support of conducting affordability analysis and management. As such, it is important to understand the concept of affordability within the Systems Engineering paradigm. This blog is designed to provide insights, lessons learned, and suggestions for using Value Engineering in the affordability process. In ...
Original Post Date: Tuesday, July 3, 2012  Introduction A cost estimation model needs to be fed data, and it is only as good as the data used to create it. Frequently the data needed to ‘feed’ the cost model comes from a variety of sources, including engineers, subject matter experts, or other individuals not directly building the cost model. Just asking these experts to stick their finger in the air and take a guess isn’t always the best approach. Using the COM-enabled Data Input Form Excel solution that comes with TruePlanning can help users obtain the data needed to complete ...
Original Post Date: Tuesday, July 3, 2012 Introduction @RISK and Crystal Ball are two Excel-based applications that allow users to perform uncertainty, sensitivity, or risk analysis on data contained in Excel spreadsheets. The analysis can be performed using various techniques, including Monte Carlo and Latin Hypercube.  Many TruePlanning users are interested in performing this type of analysis on the models they create in TruePlanning, or may even have requirements to perform this type of analysis on their estimates.  In response to this desire, PRICE Systems L.L.C. has created two Excel-based solutions that allow users to easily leverage the ...
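To illustrate the kind of Monte Carlo uncertainty analysis the post describes, here is a minimal Python sketch. The cost drivers and their triangular distributions are invented for illustration; they are not taken from TruePlanning, @RISK, or Crystal Ball.

```python
import random

random.seed(42)  # reproducible illustration

def sample_total_cost():
    """Draw one trial of total cost from assumed driver distributions."""
    # random.triangular(low, high, mode) -- parameters are hypothetical
    labor = random.triangular(80_000, 150_000, 100_000)
    materials = random.triangular(20_000, 60_000, 35_000)
    return labor + materials

# Run the simulation and read off percentiles of the cost distribution
trials = sorted(sample_total_cost() for _ in range(10_000))
p50 = trials[len(trials) // 2]       # median cost
p80 = trials[int(len(trials) * 0.8)] # 80th-percentile cost
print(f"P50 cost: {p50:,.0f}")
print(f"P80 cost: {p80:,.0f}")
```

Tools like @RISK and Crystal Ball automate this loop over spreadsheet cells and add sensitivity charts, but the underlying idea is the same: sample the uncertain inputs many times and examine the resulting output distribution.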
Original Post Date: Tuesday, June 5, 2012 Ever wonder which programming languages are the most productive?  I recently did a little research into this topic using the International Software Benchmark Standards Group (ISBSG) database. The database contains over 5000 data points with size and effort data for projects from a wide variety of industries, applications, and countries.  Of course, not all 5000 data points were suitable for my investigation.  The software size is measured using functional size metrics, but the database accepts projects that use various counting methods.  I narrowed my search to projects that used the International Function Points ...
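The comparison described above boils down to computing a delivery rate (effort hours per function point) per language and comparing averages. Here is a hypothetical sketch; the field names and values are invented examples, not actual ISBSG records.

```python
from collections import defaultdict
from statistics import mean

# Invented ISBSG-style records: functional size and effort per project
records = [
    {"language": "Java", "function_points": 400, "effort_hours": 3200},
    {"language": "Java", "function_points": 250, "effort_hours": 1800},
    {"language": "COBOL", "function_points": 300, "effort_hours": 3600},
]

# Group delivery rates (hours per function point) by language
rates = defaultdict(list)
for r in records:
    rates[r["language"]].append(r["effort_hours"] / r["function_points"])

# Lower hours-per-FP means higher productivity
for lang, values in sorted(rates.items()):
    print(f"{lang}: {mean(values):.1f} hours per function point")
```

A real analysis would also filter for a single counting method (as the post does with IFPUG function points) so the size numbers are comparable across projects.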
Original Post Date: Monday, April 2, 2012 In my previous blog, I introduced the relationship between three “breakdown structures” commonly used in project management, and how cost estimates are linked to them and enhance their usefulness.  In this blog, I’ll dig further into these relationships and explain their impact on my Total Ownership Cost (TOC) solution in TruePlanning.  Let’s use an example of an aircraft being built for the Army.  In this hypothetical example, the prime contractor is Boeing, and they have different departments working on various parts of the aircraft.  Department 10 is responsible for the wings and ...
Original Post Date: Monday, March 26, 2012 In my research on project management and cost estimation, I often come across three different “breakdown structures” which are useful in dissecting a project from various points of view.  The work breakdown structure (WBS) is oriented around project deliverables; it breaks a system down into subsystems, components, tasks and work packages.  The organization breakdown structure (OBS) shows the structure of the organizations involved in the project, including how these organizations break down into sites, divisions, teams, etc.  Finally, the cost breakdown structure (CBS) examines a project in terms of useful cost categories, ...
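The three breakdown structures can be thought of as three roll-up axes over the same set of cost items. The following minimal Python sketch assumes a simple tagged-record data model; the codes, departments, and amounts are invented for illustration.

```python
from collections import defaultdict

# Each cost item is tagged against all three breakdown structures
cost_items = [
    {"wbs": "1.1 Wings",    "obs": "Dept 10", "cbs": "Labor",     "cost": 500},
    {"wbs": "1.1 Wings",    "obs": "Dept 10", "cbs": "Materials", "cost": 300},
    {"wbs": "1.2 Fuselage", "obs": "Dept 20", "cbs": "Labor",     "cost": 700},
]

def rollup(items, axis):
    """Total cost by one breakdown axis: 'wbs', 'obs', or 'cbs'."""
    totals = defaultdict(float)
    for item in items:
        totals[item[axis]] += item["cost"]
    return dict(totals)

print(rollup(cost_items, "wbs"))  # by deliverable (WBS)
print(rollup(cost_items, "obs"))  # by organization (OBS)
print(rollup(cost_items, "cbs"))  # by cost category (CBS)
```

Tagging each estimate once and rolling it up along any axis is what makes a single cost model useful to deliverable owners, department managers, and financial analysts alike.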
Original Post Date: Friday, March 23, 2012 Recently I have been playing around with the International Software Benchmark Standards (ISBSG) database for Development and Enhancement projects.  And in the interest of full disclosure, I admit that I am more than a little excited to have close to 6000 data points at my fingertips.  I will further admit that there’s something quite daunting about having this much data: where to start, what to look for, and how best to use this data to offer useful guidance for software cost estimation.  For those of you not familiar ...
Original Post Date: Monday, December 12, 2011 Check out this paper, “The Economics of Community Open Source Software Projects: An Empirical Analysis of Maintenance Effort.”  In it, the authors hypothesize that Open Source practices increase the quality of the software that gets produced and subsequently lead to code that is less costly to maintain.  Low-quality code must be refactored more frequently than high-quality code, and there is substantial evidence that maintenance interventions tend to lead to even more degradation of the quality of the code.  So not only are low-quality applications more expensive to maintain, the unit ...
Original Post Date: Monday, November 28, 2011  Check out this blog post on project estimation.  The author discusses the practice of ‘padding’ effort estimates and how destructive this practice can be to healthy project management.  She suggests that project team members, rather than padding their individual efforts at a task level, should collaborate with project management to produce a good solid project plan with sufficient contingency reserves.  This allows the project plan to reflect the most likely case while containing a safety net for those cases where the stuff that was unknown at the time of project ...
Original Post Date: Tuesday, November 8, 2011  Last week I attended the 26th annual COCOMO forum.  This meeting is an interesting combination of conference and working group, and for me it’s a great place to take the pulse of the software and systems estimating community.  Lots of times you’ll go to a conference like this and feel as though the same old things are repeated year after year.  Not so with this conference – it is always a great mix of seasoned practitioners and graduate students, with both groups providing forward-looking information and inspiration on a variety of ...