• Predictive Analytics for Improved Cost Management  






Original Post Date: Tuesday, June 19, 2012 In my last blog, I talked about the major research study on electronics being undertaken by the PRICE cost research team this year.  So far, we have toured modern electronics facilities, interviewed electronics experts, and visited customer sites to discuss their electronics estimating challenges. Among other things, we want to revisit and improve our “Manufacturing Complexity for Electronics” calculator.  This calculator guides users through a process of describing the electronic components being modeled in a way that helps them quantify the complexity.  The first steps involve describing the equipment type and the technologies used.  ...
Original Post Date: Monday, December 12, 2011 Check out this paper, “The Economics of Community Open Source Software Projects: An Empirical Analysis of Maintenance Effort.”  In it the authors hypothesize that Open Source practices increase the quality of the software that gets produced and subsequently lead to code that is less costly to maintain.  Low-quality code must be refactored more frequently than high-quality code, and there is substantial evidence that maintenance interventions tend to lead to even more degradation of the quality of the code.  So not only are low-quality applications more expensive to maintain, the unit ...
Original Post Date: Friday, September 2, 2011  The IEEE published “Top 11 Technologies of the Decade” in the January 2011 edition of IEEE Spectrum magazine.  It should come as a surprise to no one that the smartphone was number 1 on this list.  The answer to the author’s question “Is your phone smarter than a fifth grader?” was a resounding YES![1]   In 1983 Motorola introduced the first handheld cellular phone.  It weighed in at two and a half pounds, had memory capacity for 30 phone numbers, took 10 hours to recharge, and had a selling price of $4,000 ...
Original Post Date: Friday, June 17, 2011  Building transparency and traceability into your estimating process leads to more defensible estimates, and we can help you do that. We will demonstrate how historical data is transformed into predictive models.   You will learn how your data can be synthesized into custom models that can be employed in support of third-party models within a single analytical framework. Learn more at our webinar on June 29th @ 11am Eastern.  Reserve your no-charge, no-obligation webinar seat now at: https://www2.gotomeeting.com/register/372682434
Original Post Date: Wednesday, June 15, 2011 While teaching an introductory TruePlanning for Software Estimating course this week at an Army location, I was asked to follow up with a clarification on “percent adapted” calculation.  The official PRICE training materials definitions are:   • Percent of Design Adapted - the percentage of the existing (adapted code) design that must change to enable the adapted code to function and meet the software project requirements;   • Percent of Code Adapted - the percentage of the adapted code that must change to enable the adapted code to function and meet the software project requirements.   The former, Design, ...
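The two percentages above are the kind of inputs that reuse models combine into an equivalent-size figure. TruePlanning's internal calculation is not shown in this excerpt, so as an illustration only, here is a minimal sketch using the classic COCOMO-style adaptation adjustment factor (AAF = 0.4·DM + 0.3·CM + 0.3·IM), which weights analogous design-modified, code-modified, and integration percentages; the weights and the `pct_integration` input are assumptions from COCOMO, not the PRICE formula:

```python
def adaptation_adjustment_factor(pct_design, pct_code, pct_integration):
    """COCOMO-style AAF: weighted blend of percent design modified,
    percent code modified, and percent re-integration effort.
    NOTE: illustrative only; not the TruePlanning calculation."""
    return 0.4 * pct_design + 0.3 * pct_code + 0.3 * pct_integration

def equivalent_sloc(adapted_sloc, aaf):
    """Scale the adapted code size by the AAF to get the size that
    behaves like new development for estimating purposes."""
    return adapted_sloc * aaf / 100.0

# Example: 10,000 adapted SLOC with 20% of design and 30% of code changed,
# and 50% of the integration effort required.
aaf = adaptation_adjustment_factor(20, 30, 50)   # 0.4*20 + 0.3*30 + 0.3*50 = 32.0
print(equivalent_sloc(10_000, aaf))              # 3200.0 equivalent SLOC
```

The point of the sketch is simply that "percent of design adapted" and "percent of code adapted" enter the estimate as separate drivers with different leverage, which is why the course materials define them separately.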
Original Post Date: Wednesday, May 25, 2011  Going to ISPA SCEA in New Mexico?  If so, join us for a workshop on data-driven cost estimating.  Description:  Building transparency and traceability into your estimating process leads to more defensible estimates.  This hands-on workshop demonstrates how historical data is transformed into predictive models.   You will learn how your organization’s data can be synthesized into custom models that can be employed in support of third-party models within a single analytical framework.  Participants will learn:  (1) To develop system-level estimating relationships to provide a test of reasonableness and historical cross-check to proposed estimates. (2) To develop ...
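The workshop does not spell out its method in this excerpt, but a common way historical data becomes a system-level estimating relationship is an ordinary least-squares fit of a power-law CER, cost = a·driver^b, in log-log space. A minimal sketch under that assumption (the data values below are made up for illustration):

```python
import math

def fit_power_cer(drivers, costs):
    """Fit cost = a * driver**b by ordinary least squares on the logs.
    Illustrative sketch of a generic CER fit, not a PRICE algorithm."""
    xs = [math.log(d) for d in drivers]
    ys = [math.log(c) for c in costs]
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    # Slope of the log-log regression line is the exponent b.
    b = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
         / sum((x - xbar) ** 2 for x in xs))
    # Intercept gives log(a); exponentiate to recover the coefficient.
    a = math.exp(ybar - b * xbar)
    return a, b

# Hypothetical history: weights (kg) and actual costs ($K) of past systems.
weights = [50, 120, 300, 800]
actuals = [210, 430, 950, 2200]
a, b = fit_power_cer(weights, actuals)
estimate = a * 500 ** b   # cross-check a proposed 500 kg system
```

A fit like this gives the "test of reasonableness" the workshop describes: the proposed estimate for a new system can be compared against what the organization's own history predicts for a system of that size.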
Original Post Date: Thursday, May 19, 2011 I was recently asked by a client to provide a synopsis of what TruePlanning offers in response to the Ashton Carter Memorandum – Implementation of Will-Cost and Should-Cost Management. In the memo, the Undersecretary of Defense AT&L listed “Selected Ingredients of Should Cost Management”. It was interesting to note how much capability TruePlanning provides to effectively support efficient should-cost management. In this month’s blog, I will share my response to our client with you. ...
Original Post Date: Wednesday, May 18, 2011  Parametrics is more than estimating. It represents the complete process of capturing and utilizing (often with calibration) non-cost drivers, as well as associated programmatics and configuration levels. The Wiki definition of systems engineering immediately speaks to project complexity, life cycle management, and logistics. Is there any question that parametrics and systems engineering are interrelated?  In many of our customer organizations, affordability and cost-benefit analyses have migrated to systems engineering functions. How and where does your organization perform these analyses?  As we enhance our capabilities and applications, it’s beneficial for all concerned to understand your adaptation of parametrics within the core ...
Original Post Date: Tuesday, March 29, 2011 “I think we have an obligation to work with industry to ensure that our suppliers do not just remain world class in defence, but aspire to be world-class manufacturers that can withstand comparison to other industries.” – Chief of Defence Procurement, Sir Robert Walmsley. Is this a practical proposition, or is it a pipe dream?  The following excerpt from Dale Shermon’s Systems Cost Engineering attempts to make the case that this type of comparison is possible. Many of the statements in proposals and marketing literature asserting the superiority of a company are anecdotal or at best qualitative ...
Original Post Date: Wednesday, March 16, 2011 In Parametrics is Free, I acknowledged receiving (too late) “you should’ve known to ask that” over the years. Quality control after the fact is fine, but it’s better and cheaper to take a systematic approach to quality assurance as part of your estimating process. The sheer volume of what we model can often keep us so close to the details that we are unable to step back and put our QA hat on for a sanity check. Enter Quality! On a very large project, our team has introduced a few regular cross-checks, notwithstanding typical math check-sums.   A round table peer ...