Predictive Analytics for Improved Cost Management



Original Post Date: Tuesday, October 2, 2012 We’re building a tool that quickly and easily allows you to map costs in your TruePlanning(R) projects to any custom Work Breakdown Structure (WBS), including templates for the MIL-STD-881C.  By mapping TruePlanning costs to fit your point of view, you can make use of your organization’s existing research, and ensure that your cost estimates are complete, realistic, and can be easily compared to other projects apples-to-apples.  This is great for AoAs and analyzing where and why costs differ between various solutions (not to mention 881C mapping is a required deliverable for major ...
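As a rough illustration of the mapping idea, here is a minimal sketch in Python; the cost-object names, 881C-style element labels, and dollar figures are invented for the example and are not TruePlanning outputs or the actual tool described above.
```python
# Minimal sketch: map estimated costs onto a custom WBS (here, a few
# MIL-STD-881C-style element names). All names and values are illustrative.
from collections import defaultdict

# (cost object, estimated cost) pairs, as they might come out of an estimate
cost_objects = [
    ("Airframe Structure", 1_200_000),
    ("Mission Computer",     450_000),
    ("Flight Software",      800_000),
    ("Program Office",       300_000),
]

# Mapping from each cost object to a WBS element (the analyst's point of view)
wbs_map = {
    "Airframe Structure": "1.1 Air Vehicle",
    "Mission Computer":   "1.1 Air Vehicle",
    "Flight Software":    "1.2 Software",
    "Program Office":     "1.3 Systems Engineering/Program Management",
}

rollup = defaultdict(float)
for name, cost in cost_objects:
    rollup[wbs_map[name]] += cost

for element, total in sorted(rollup.items()):
    print(f"{element}: ${total:,.0f}")
```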
Original Post Date: Thursday, September 27, 2012 I am frequently questioned by clients and prospects about the applicability of PRICE’s parametric software estimation model to agile software development projects.  There are several ways one could respond to this.  My first thought is that if a shop is truly agile, they don’t need an estimation tool.  They know their development team velocity because agile teams are committed to measurement.  They also either know when they need to make a delivery – in which case whatever amount of software they’ve built by that point will be released.  Alternatively they may know a ...
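To make the velocity point concrete, here is a minimal sketch, with invented numbers, of how a team that measures its velocity can forecast how much of its backlog it will have delivered by a fixed release date.
```python
# Minimal sketch: forecast delivered scope from measured sprint velocity.
# The velocities, sprint length, and backlog size below are illustrative only.
observed_velocities = [28, 31, 26, 30]          # story points per sprint
sprint_length_weeks = 2
weeks_until_release = 12
backlog_points = 200

avg_velocity = sum(observed_velocities) / len(observed_velocities)
sprints_remaining = weeks_until_release // sprint_length_weeks
forecast_points = avg_velocity * sprints_remaining

print(f"Average velocity: {avg_velocity:.1f} points/sprint")
print(f"Forecast by release: {forecast_points:.0f} of {backlog_points} points")
```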
Original Post Date: Monday, September 24, 2012 The purpose of this blog is to describe the role Value Engineering (VE) plays within the affordability process. It is not a step-by-step guide on how to conduct or execute Value Engineering; rather, it is a discussion of the context, inputs, setup, execution hints, and outputs of Value Engineering in support of affordability analysis and management. As such, it is important to understand the concept of affordability within the Systems Engineering paradigm. This blog is designed to provide insights, lessons learned, and suggestions for using Value Engineering in the affordability process. In ...
Original Post Date: Friday, September 7, 2012 So, what goes on out here in the corn fields, anyway?  Did you know Dayton, Ohio is home to the Air Force Materiel Command?  Air Force Materiel Command (AFMC), headquartered at Wright-Patterson AFB, Ohio, “develops, acquires and sustains the aerospace power needed to defend the United States and its interests for today and tomorrow. This is accomplished through management, research, acquisition, development, testing and maintenance of existing and future weapons systems and their components” (afmc.af.mil).   In response to future budget uncertainty and a challenge by Congress to operate more efficiently, AFMC recently ...
Original Post Date: Thursday, August 23, 2012 We’ve bought access to IHS Haystack, which links us to huge amounts of data on government parts and logistics, including electronics running the gamut of technologies, functions, and types of equipment.  The tool combines data from more than 70 military and commercial databases, and includes detailed data on costs, procurement histories, and a wide range of technical characteristics – perfect for building and validating CERs. This data will be very useful for our TP 2013 Electronics Update.  All of this data can be built into a knowledge base and made available to TruePlanning users.  With this ...
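By way of illustration only, this sketch fits a simple log-linear cost estimating relationship (cost as a power function of weight) to a handful of made-up data points; the real work would use the Haystack records and whatever cost drivers the analysis supports.
```python
# Minimal sketch: fit a power-law CER, cost = a * weight^b, by linear
# regression in log space. The data points are invented for illustration.
import numpy as np

weights = np.array([ 5.0, 12.0, 20.0, 35.0, 60.0])   # e.g., unit weight (lb)
costs   = np.array([ 9.5, 21.0, 33.0, 52.0, 88.0])   # e.g., unit cost ($K)

# Linear least squares on log-transformed data: ln(cost) = ln(a) + b*ln(weight)
X = np.column_stack([np.ones_like(weights), np.log(weights)])
coef, *_ = np.linalg.lstsq(X, np.log(costs), rcond=None)
a, b = np.exp(coef[0]), coef[1]

print(f"CER: cost ~ {a:.2f} * weight^{b:.2f}")
print(f"Predicted cost at 25 lb: {a * 25**b:.1f} $K")
```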
Original Post Date: Tuesday, July 3, 2012 Introduction @RISK and Crystal Ball are two Excel-based applications that allow users to perform uncertainty, sensitivity, or risk analysis on data contained in Excel spreadsheets. The analysis can be performed using various techniques including Monte Carlo and Latin Hypercube.  Many TruePlanning users are interested in performing this type of analysis on the models they create in TruePlanning or may even have requirements to perform this type of analysis on their estimates.   In response to this desire, PRICE Systems L.L.C. has created two Excel-based solutions that allow users to easily leverage the ...
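As a rough sketch of the kind of analysis those add-ins perform (plain Python/NumPy here, not the PRICE Excel solutions themselves), the following Monte Carlo example propagates triangular uncertainty on two cost elements to a total-cost distribution; the distributions and values are assumptions for illustration.
```python
# Minimal sketch: Monte Carlo uncertainty analysis on a two-element cost
# roll-up, using triangular distributions. All inputs are illustrative.
import numpy as np

rng = np.random.default_rng(42)
n_trials = 100_000

# (low, most likely, high) triangular assumptions for each cost element, $K
hardware = rng.triangular(800, 1_000, 1_400, n_trials)
software = rng.triangular(400,   600, 1_100, n_trials)

total = hardware + software

p10, p50, p80 = np.percentile(total, [10, 50, 80])
print(f"Total cost P10/P50/P80 ($K): {p10:,.0f} / {p50:,.0f} / {p80:,.0f}")
```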
Original Post Date: Monday, April 2, 2012 In my previous blog, I introduced the relationship between three “breakdown structures” commonly used in project management, and how cost estimates are linked to them and aid in their usefulness.  In this blog, I’ll dig further into these relationships, and explain their impact on my Total Ownership Cost (TOC) solution in TruePlanning.   Let’s use an example of an aircraft being built for the Army.  In this hypothetical example, the prime contractor is Boeing, and they have different departments working on various parts of the aircraft.  Department 10 is responsible for the wings and ...
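A minimal sketch of that relationship, using invented departments, WBS elements, and costs rather than the actual example from the post, is below: each cost record carries both a WBS element and an OBS element, so the same costs can be rolled up from either point of view.
```python
# Minimal sketch: the same cost records rolled up by WBS or by OBS.
# Departments, WBS elements, and dollar values are invented for illustration.
from collections import defaultdict

records = [
    {"wbs": "1.1 Wings",    "obs": "Department 10", "cost": 500_000},
    {"wbs": "1.2 Fuselage", "obs": "Department 20", "cost": 750_000},
    {"wbs": "1.3 Avionics", "obs": "Department 10", "cost": 300_000},
]

def rollup(records, key):
    totals = defaultdict(float)
    for r in records:
        totals[r[key]] += r["cost"]
    return dict(totals)

print("By WBS:", rollup(records, "wbs"))
print("By OBS:", rollup(records, "obs"))
```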
Original Post Date: Monday, March 26, 2012 In my research on project management and cost estimation, I often come across three different “breakdown structures” which are useful in dissecting a project from various points of view.  The work breakdown structure (WBS) is oriented around project deliverables; it breaks a system down into subsystems, components, tasks and work packages.  The organization breakdown structure (OBS) shows the structure of the organizations involved in the project, including how these organizations break down into sites, divisions, teams, etc.  Finally, the cost breakdown structure (CBS) examines a project in terms of useful cost categories, ...
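To make the structural idea concrete, here is a minimal sketch (with an invented hierarchy) of a breakdown structure represented as a tree, with leaf costs rolled up to their parents; the same representation works for a WBS, an OBS, or a CBS.
```python
# Minimal sketch: a breakdown structure as a tree of nested dicts,
# with leaf costs rolled up to parents. The hierarchy is illustrative only.
wbs = {
    "System": {
        "Subsystem A": {"Component A1": 120_000, "Component A2": 80_000},
        "Subsystem B": {"Component B1": 200_000},
    }
}

def total(node):
    """Sum leaf costs beneath a node (a number) or subtree (a dict)."""
    if isinstance(node, dict):
        return sum(total(child) for child in node.values())
    return node

print("System total:", total(wbs["System"]))                      # 400000
print("Subsystem A total:", total(wbs["System"]["Subsystem A"]))  # 200000
```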
Original Post Date: Friday, March 23, 2012 Recently I have been playing around with the International Software Benchmarking Standards Group (ISBSG) database for Development and Enhancement projects.  And in the interest of full disclosure I admit that I am more than a little excited to have close to 6000 data points at my fingertips.  I will further admit that there’s something quite daunting about having this much data: where do I start?  What should I be looking for?  How can I best use this data to offer useful guidance for software cost estimation?  For those of you not familiar ...
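As an example of a first pass over a dataset like that, one might start with a simple productivity distribution; the file name and column names in this sketch are assumptions for illustration, not the actual ISBSG schema.
```python
# Minimal sketch: a first look at productivity in a project dataset.
# File name and column names are assumptions, not the real ISBSG schema.
import pandas as pd

df = pd.read_csv("isbsg_projects.csv")            # hypothetical extract
df = df.dropna(subset=["effort_hours", "function_points"])

# Productivity expressed as hours of effort per function point delivered
df["hours_per_fp"] = df["effort_hours"] / df["function_points"]

print(df["hours_per_fp"].describe(percentiles=[0.25, 0.5, 0.75]))
print(df.groupby("language_type")["hours_per_fp"].median())
```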