Predictive Analytics for Improved Cost Management



Blog



Original Post Date: Friday, July 9, 2010  While sitting in the operatory chair yesterday, my dentist said something that made me stop. He was complaining about an increasing rate of incompetence and apathy he observes in those delivering services to him. And while I do agree with him in principle, he and I are of the age where some folks label us as grumpy old men. So, it may not be as bad as we think. Regardless, the statement he made to an unfortunate poor-quality service provider was, “If you don’t have the time to do it ...
Original Post Date: Monday, October 18, 2010 Some of us remember taking the Iowa tests during our early school days. The Iowa Tests of Basic Skills (ITBS) are standardized tests provided as a service to schools by the College of Education of The University of Iowa. The tests, administered to students in grades K-8, became a national standard for measuring scholastic aptitude – I was educated in Pennsylvania. Now out of Iowa comes another test of sorts, something called an Integrity Index Score based upon a proprietary algorithm of an organization called Iowa Live. Iowa Live calls itself, “a ...
Original Post Date: Wednesday, May 12, 2010 From my perspective as a cost researcher, the calibration tool is one of the most powerful analysis capabilities built into the TruePlanning cost management software. One way I can use this tool is to go back to an old estimate for a project that is now completed, and analyze the correctness of the previously entered input values. With this analysis, I can find ways to improve our methods of soliciting input values from the user to ensure the best values are entered the next time. This way, the TruePlanning models keep getting “smarter” as new information ...
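To illustrate the general idea behind calibration (a generic sketch only, not the TruePlanning tool's actual mechanics), the snippet below backs a single adjustment factor out of a completed project and reuses it on the next estimate; the toy model form and all numbers are hypothetical.

```python
# Generic calibration sketch: solve for the multiplier that makes a simple
# parametric model reproduce the actual cost of a completed project.
# The model form and all values are hypothetical, for illustration only.

def parametric_cost(size, complexity, factor=1.0):
    """Toy power-law cost model: cost = factor * a * size^b * complexity^c."""
    a, b, c = 2.5, 0.9, 1.2          # notional coefficients
    return factor * a * size ** b * complexity ** c

def calibrate(size, complexity, actual_cost):
    """Back out the calibration factor from one completed project."""
    baseline = parametric_cost(size, complexity, factor=1.0)
    return actual_cost / baseline

if __name__ == "__main__":
    factor = calibrate(size=120.0, complexity=1.4, actual_cost=650.0)
    print(f"calibration factor: {factor:.2f}")
    # Reuse the factor on the next, similar estimate
    print(f"calibrated estimate: {parametric_cost(150.0, 1.3, factor):.1f}")
```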
Original Post Date: Friday, June 4, 2010 One of the great features of the TruePlanning cost management software is that it makes it easy to handle the complications of inflation and of estimating projects performed in different countries and currencies. The costs associated with doing work in different countries, and the relative value of different currencies, are constantly changing. To address this, the cost research team at PRICE performs an annual economic update, and this blog will introduce some of the basic concepts and research that go into maintaining this feature every year. The price of goods and ...
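As a rough illustration of the two adjustments such an update supports, the sketch below escalates a base-year cost to a target year with an inflation index and then converts it to another currency; the index values and exchange rate are placeholders, not PRICE research data.

```python
# Minimal escalation + currency conversion sketch.
# Index values and the exchange rate below are placeholders, not PRICE data.

inflation_index = {2008: 0.96, 2009: 0.98, 2010: 1.00, 2011: 1.025, 2012: 1.05}

def escalate(cost, base_year, target_year):
    """Convert a cost expressed in base_year dollars to target_year dollars."""
    return cost * inflation_index[target_year] / inflation_index[base_year]

def convert(cost, rate):
    """Convert a cost to another currency at a given exchange rate."""
    return cost * rate

if __name__ == "__main__":
    cost_2010_usd = 1_000_000.0
    cost_2012_usd = escalate(cost_2010_usd, 2010, 2012)
    cost_2012_eur = convert(cost_2012_usd, rate=0.78)   # notional USD->EUR rate
    print(f"{cost_2012_usd:,.0f} USD (2012) ~= {cost_2012_eur:,.0f} EUR (2012)")
```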
Original Post Date: Thursday, June 24, 2010 "And me boss. And me boss. And me boss!" Just like Bugs Bunny tricking the mob boss into an unfair share of the loot, who doesn’t want a piece of the action? In this case the “action” is the estimate you have just finished in TruePlanning and would like to share with your coworkers. No problem, just share your project. Project sharing is a feature available to users of the Client/Server version of TruePlanning, and it allows users to access projects that have been created on the centralized database by other users. If ...
Original Post Date: Monday, April 26, 2010  With so many acquisition programs over budget and behind schedule, the term “Cost Realism” is suddenly very popular. In my experience as an estimator on many major acquisition programs, two things have remained certain over the years (besides death and taxes). First, the probability of the program ever achieving the original cost estimate is exactly zero, and second, the more information that is known about a program, the more it will exceed its original cost estimate. With that said, the move to Cost Realism is so important because it recognizes these two fundamental ...
Original Post Date: Friday, May 21, 2010 Last month I blogged about the importance of cost realism, its roots and how as estimators we must always reflect the truth, no matter how unpopular. This month I want to share with you a recent experience on a Source Selection. As part of the Source Selection team, my role was to conduct a Cost Realism estimate on each of the performers submitting bids. I want to share with you a few insights from that experience. One of the first rules I always follow is to never ask engineers to provide data that ...
Original Post Date: Wednesday, June 30, 2010  I recently had the opportunity to work directly for one of our clients on a high visibility, must-win proposal. The contractor was just about ready to commit to the bid number, but wanted to know the likely bids of the other two performers. We were asked to do a “Ghosting the Competition” study, where we ethically collect open source data on the two competing designs and combine it with engineering technical data to develop a best cost estimate of the competitors’ bid positions. Unfortunately, not much intelligence was available about the competing configurations, but the ...
Original Post Date: Tuesday, July 20, 2010 Next month (8/4 @ 12pm EST) I am presenting a webinar to discuss using TruePlanning on Source Selections. What prompted me to develop this webinar were the many recent success stories I’ve had using TruePlanning during the Source Selection process. Going a bit further, I am going to show an actual case study where TruePlanning was used to conduct an Analysis of Alternatives (AoA) exercise – along with cost/effectiveness results. We will explore a bit about the technical side of the proposed designs, develop the modeling in TruePlanning and discuss the results. In addition, we will explore ...
Original Post Date: Wednesday, September 1, 2010 I had expected to present my webinar, “Best Practices for Cost Effectiveness Studies using TruePlanning,” in early August. As you might know, I was planning to show a real world example from a recent engagement with a government customer. Unfortunately, since the Source Selection has not concluded with a downselect, I was not able to obtain the public release in time. However, for this month’s blog I will continue to share some of the highlights of the webinar. In last month’s blog we explored the uses of TruePlanning during Source Selection from the Supplier’s (or ...
Original Post Date: Tuesday, October 12, 2010 When I glanced at the Washington Post on Sunday, the following headline screamed out: Defense cuts could slow D.C. economy for years The article basically covers how Defense Secretary Robert M. Gates is calling for reducing spending on "support contractors" by 10 percent in each of the next three years as the Defense budget shrinks. As Washington DC is a hub for these types of companies, the impact is expected to be significant. According to the article, more than a quarter of national defense spending consists of outlays for service contracts. Among the largest companies ...
Original Post Date: Wednesday, November 10, 2010 I was recently struck by Ash Carter’s (Under Secretary of Defense for Acquisition, Technology & Logistics) Memorandum for Acquisition Professionals, Better Buying Power: Guidance for Obtaining Greater Efficiency and Productivity in Defense Spending (14 September 2010). Within this broad sweeping memo, Ash Carter outlines 23 principal actions in five major areas aimed at increasing efficiency in Defense acquisition.  The first major area covered is “Target Affordability and Control Cost Growth”. Within this major area, program managers must treat affordability as a requirement before milestone authority is granted to proceed (starting with Milestone A). This ...
Original Post Date: Friday, December 17, 2010 In last month’s blog I wrote about Ash Carter’s (Under Secretary of Defense for Acquisition, Technology & Logistics) Memorandum for Acquisition Professionals, Better Buying Power: Guidance for Obtaining Greater Efficiency and Productivity in Defense Spending (14 September 2010). I concluded that the TruePlanning unified framework, with its comprehensive cost models, is a tool very well suited to provide the types of analysis outlined in the memorandum. In terms of Should Cost and Independent Cost Estimates (ICE), TruePlanning estimation software provides the industry standard capability to conduct Should Cost and calibration (actual program history) for ICE. Most ...
Original Post Date: Wednesday, June 23, 2010 Parametric modeling is excellent for all aspects of early-concept cost estimation, including go/no-go decisions downstream. So, in the spirit of bringing transparency to (ethical) financial engineering… why not apply our craft to pricing “real options”? The latter are essentially strategic opportunities for committing resources (cost/schedule) to projects, ventures, investments, or even abandonments. The opportunity to choose has value itself!  Unlike static project Net Present Value (often, but not exclusively, approximated with Discounted Cash Flow), which assumes pre-defined decisions, real options reflect the merit of flexibility. If an R&D effort or proof of concept yields viability/marketability learning, the option has positive value, above ...
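To make the contrast concrete, here is a minimal sketch comparing a static discounted-cash-flow NPV with the expected value of a two-stage decision in which a proof of concept can be abandoned if the learning is unfavorable. All cash flows, probabilities and the discount rate are invented for illustration and are not tied to any real project.

```python
# Static NPV vs. a simple two-stage real option (abandon after proof of concept).
# All cash flows, probabilities and the discount rate are notional.

def npv(cash_flows, rate):
    """Discounted cash flow: cash_flows[t] occurs at the end of year t+1."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows, start=1))

RATE = 0.10
POC = [-2.0]                        # year-1 proof-of-concept spend ($M)
GOOD = [0, -5.0, 8.0, 8.0]          # follow-on flows if the PoC succeeds
BAD  = [0, -5.0, 1.0, 1.0]          # follow-on flows if it does not
P_GOOD = 0.5

# Static view: commit to the full program up front, no flexibility.
static_npv = npv(POC, RATE) + P_GOOD * npv(GOOD, RATE) + (1 - P_GOOD) * npv(BAD, RATE)

# Option view: fund the PoC, then continue only if the outcome is good.
option_npv = npv(POC, RATE) + P_GOOD * npv(GOOD, RATE) + (1 - P_GOOD) * 0.0

print(f"static NPV                    : {static_npv: .2f} $M")
print(f"NPV with option               : {option_npv: .2f} $M")
print(f"value of the option to abandon: {option_npv - static_npv: .2f} $M")
```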
Original Post Date: Friday, June 25, 2010  Like titanium and other exotic metal materials, “composites” (by definition, combinations of materials) offer significant weight savings and reduced part counts, but at the price of high production cost. Sound contrarian to our parametric cost estimating view? Not really. Complexity of manufacture is considerably higher. Likewise, process index and structural tooling values grow. Plus, design lead times drive developmental cycles. That said, understand that composites represent more than a material type. They can involve a highly labor-intensive approach to preparing, braiding/winding, molding, bonding and modular assemblage. Yes, some aspects of braiding and molding lend themselves to automation—which then drives tooling ...
Original Post Date: Thursday, August 12, 2010 The late Philip Crosby’s “Quality is Free” taught us that an investment in quality is more than offset by the prevention of defects based upon an understanding of requirements. Only with the latter can lack of conformance (and its subsequent costs) be captured and hence quality quantified. So how then is Parametrics relevant? Parametric estimating is more than cost modeling. Our craft represents an initial consulting function into the accuracy and completeness of program planning concepts. Our customers trust us to know when to ask and when to supplement. Yes, we are mathematical and financial modelers too. But I’d suggest that “Parametrics is ...
Original Post Date: Tuesday, August 24, 2010 Over the past several weeks a number of users have inquired about the best way to estimate costs associated with porting existing software to a new hardware environment. Normally in this situation some of the existing software will require some amount of adaptation to operate on the new server; however, a large portion of the existing software will only require integration into the new environment. Estimating the software costs associated with the above will require the use of several cost objects: - Systems cost object if program management, Quality Assurance, configuration, and documentation costs are to be included in ...
Original Post Date: Tuesday, April 20, 2010 In September of 2009 the United States Government Accountability Office (GAO) submitted a report[1] discussing the lack of robust Analysis of Alternatives for weapons systems. The report indicated that … “Cost, schedule, and performance problems in the Department of Defense’s (DOD) weapon system programs are serious.” Why is it that DoD weapons programs experience simultaneous cost growth and performance degradation? I believe the answer is found in unrealistic cost estimates and schedule estimates, mostly driven by pressure to win a program within a certain budget constraint. Excessive requirements change, either through poor ...
Original Post Date: Monday, June 7, 2010 Currently we are exploring the best approach to including a more comprehensive cost estimate for Total Ownership Costs (TOC) in TruePlanning. The current version of the software has focused on development and production costs, with some life cycle costing included. The life cycle costs included are focused on the system-specific O&S costs such as initial spares for priming the supply pipeline, maintenance, replenishment spares, etc. It is a system view as opposed to a program view of TOC. As we better understand the need to conduct affordability studies it has become clear that design decisions ...
Original Post Date: Monday, September 20, 2010 I have been fortunate in my career to have been associated with some great mentors. Each individual has provided me a bit of a golden nugget to carry with me as I tried to navigate my way through the professional waters. My first “civilian” manager, after I left the service and joined industry, provided me a list of the Laws of Analysis (I had just started a position as an operations research analyst). He explained that this list was a mix of serious and tongue in cheek snippets of wisdom. I looked at ...
Original Post Date: Wednesday, September 29, 2010 In May of this year the Washington Post published an editorial article on the need to reduce waste in the Defense Department. The byline of the article was “Defense Secretary Gates’s war of necessity against wasteful spending.” In this article the writer points out that the secretary is taking on the challenge of maintaining our military force [at a reasonable level of effectiveness] during a time in which the President and Congress are seeking cost savings / reductions based on the decrease in our presence in Iraq. Mr. Gates’s goal is to look for efficiencies ...
Original Post Date: Wednesday, September 25, 2013 We all pull data and research from various sources when creating a project estimate. You may pull together public CERs, internal research, subscription-based data or commercial models. In the end you want your entire estimate in one format. If you use TruePlanning, you may have used the “Other Direct Cost Object” in the past to include costs estimated in another model. You may have utilized the “Equation Cost Object” to include a CER with up to 5 variables, which would allow you to account for the size and complexity of an ...
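For readers who have not built one, the snippet below shows the general shape of a multi-variable CER of the kind such a cost object can hold; the functional form, coefficients and learning-curve slope are all made up for illustration and do not represent any TruePlanning equation.

```python
# General shape of a simple five-driver CER (form and coefficients are notional).
import math

def cer(weight_kg, complexity, new_design_pct, quantity, experience_factor):
    """Illustrative five-driver CER returning total cost in notional $K."""
    first_unit = (
        45.0
        * weight_kg ** 0.72                 # size driver
        * complexity ** 1.30                # difficulty driver
        * (1.0 + new_design_pct) ** 0.50    # penalty for new design content
        * experience_factor                 # team / process adjustment
    )
    b = math.log2(0.90)                     # 90% learning-curve slope
    return first_unit * sum(n ** b for n in range(1, quantity + 1))

if __name__ == "__main__":
    total = cer(weight_kg=35, complexity=1.2, new_design_pct=0.4,
                quantity=25, experience_factor=0.95)
    print(f"notional total: {total:,.0f} $K")
```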
Original Post Date: Thursday, September 27, 2012 I am frequently questioned by clients and prospects about the applicability of PRICE’s parametric software estimation model to agile software development projects. There are several ways one could respond to this. My first thought is that if a shop is truly agile, they don’t need an estimation tool. They know their development team velocity because agile teams are committed to measurement. They also either know when they need to make a delivery – in which case whatever amount of software they’ve built by that point will be released. Alternatively they may know ...
Original Post Date: Wednesday, February 23, 2011 In the world of estimating, accuracy is the first question out of people’s mouths.  Above all else they want to know the accuracy of an estimate.  How accurate is that approximate judgment?  Craziness! True accuracy can only be determined after the project or effort has been completed and a post-audit analysis reconciles what was expected to happen with what did happen.  This is a very expensive, time consuming process that many preach about but few actually attempt.  In my experience, when people ask about accuracy what they are really interested in is ...
Original Post Date: Tuesday, February 22, 2011 In the February 2011 issue of National Defense, I was struck by the article “Uncertain Path Ahead for Military Truck Fleet”[1]. This article centered on the best strategies for modernization of the aging fleet of Humvees. The recapitalization of 150,000 Army and 25,000 Marine Corps Humvees is creating a “fix or buy new” dilemma for decision makers. According to the article, GAO analyst Michael J. Sullivan recommended a “cost-benefit analysis that would minimize the collective acquisition and support costs of the various truck programs, and reduce the risk of overlap or ...
Original Post Date: Monday, January 17, 2011 While I don’t like to admit to visiting a website entitled geekArticles.com, I did stumble across a reprint of an essay by Grant Rule “Bees and the Art of Estimating”  that some of you may find interesting and instructive.  The author participates in his own form of “Estimation Trivia” by posing the following challenge “Take paper and pencil and write your estimate for the number of insects in the average hive of English honeybees.”  Of the approximately 1100 software measurement and process improvement professionals he has challenged thusly,  only about 10 have ...
Original Post Date: Wednesday, January 12, 2011  TruePlanning 2010 SR1 estimation software is now available as an upgrade for existing PRICE customers. The most significant update to this version of TruePlanning is the capability to use both parametric estimating models and analogous data to produce estimates. This capability validates and increases the defensibility of estimates. TruePlanning provides a framework that allows content-driven parametric models to be estimated in one system – most notably hardware, software, IT and Systems of Systems (SoS). No other commercially available estimating tool can make that claim. However, whereas in previous versions estimates relied on ...
Original Post Date: Tuesday, January 11, 2011 Last week I gave a webinar which detailed the PRICE perspective on Should Cost & Will Cost Management. The responses I have received have been very positive and also informative. For those of you who could not attend, you can view the recorded version of that webinar here. Below is a brief summation of that presentation and some key takeaways. The Under Secretary of Defense issued a memo late last year. The thrust of the memo was the current need for greater efficiency and productivity in defense spending. His guidance contained 23 principal actions for improving the ...
Original Post Date: Thursday, January 6, 2011 Today, PRICE Systems Senior Research Analyst Bob Koury will be presenting on Will Cost/Should Cost management. The presentation will focus on two main requirements mandated in the Ash Carter memo (mentioned here several times): developing Should Cost/Will Cost targets and establishing affordability as a requirement. An example will be provided of how parametric estimating models were used to establish “Should Cost” targets and how they can be used by a budget authority (government or industry) to be an informed consumer of contractor or sub-contractor bids. The demonstration portion of this webinar will focus on ...
Original Post Date: Friday, December 3, 2010 Yesterday I had the pleasure of speaking at the New England SCEA Chapter's December meeting. The attendees were a great mix of experienced, seasoned cost estimators and young, new talent, eager to learn techniques to apply on the job. My topic was the program management value of combining estimating Rules of Thumb with more rigorous cost estimating models and databases [link to presentation .pdf]. Rule of Thumb estimating is used every day by program managers to help guide their projects. Oversight authorities rarely have the resources to perform detailed program estimates, so they ...
Original Post Date: Wednesday, November 17, 2010 Recently a cost research project on missiles was completed. The research resulted in performance-based equations for air-to-ground and surface-to-air missiles. The performance-based equations can be used for early concept estimation of missile development and production costs. The question, though, is “What is the process for developing this type of estimating relationship?” This will be the first of a series of BLOGs on this topic. The first task is to define what a “Performance Based Equation” is. Bruce Fad covered this definition in a previous “Data Driven BLOG,” so please review his post for the details. The second step ...
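Pending the rest of the series, here is a sketch of one common way a performance-based equation is derived: fit a log-linear (power-law) form to historical cost and performance data by least squares. The data points below are purely notional placeholders, not the missile research results.

```python
# Fitting a performance-based CER of the form cost = a * range^b * weight^c
# by ordinary least squares in log space. Data points are notional placeholders.
import numpy as np

range_km  = np.array([ 20,  45,  80,  150,  300])    # notional performance data
weight_kg = np.array([ 90, 150, 220,  400,  900])
cost_k    = np.array([310, 520, 760, 1350, 2900])    # notional unit costs ($K)

# log(cost) = log(a) + b*log(range) + c*log(weight)
X = np.column_stack([np.ones_like(range_km, dtype=float),
                     np.log(range_km), np.log(weight_kg)])
coef, *_ = np.linalg.lstsq(X, np.log(cost_k), rcond=None)
a, b, c = np.exp(coef[0]), coef[1], coef[2]
print(f"cost ~ {a:.1f} * range^{b:.2f} * weight^{c:.2f}")

# Use the fitted CER for an early-concept estimate of a new design point
print(f"estimate at 200 km / 500 kg: {a * 200**b * 500**c:,.0f} $K")
```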
Original Post Date: Monday, November 15, 2010 Last week I attended the 25th International Forum on COCOMO and Systems/Software Cost Modeling.  I attended for several reasons.  First of all, I was invited to participate on a panel whose topic was “25 years of Software Estimation: Lessons Learned, Challenges and Opportunities”.  Secondly, I have attended in the past and while it’s generally a small group, as such conferences go, I always come away impressed by the fact that so many smart people end up in one room and this year was no different.   But I digress; I really wanted to share ...
Original Post Date: Tuesday, November 2, 2010  After some recent meetings with clients I am sensing some confusion on how to estimate software reuse. I think part of the problem is in the definition of reuse, so let's start with a definition and then address the estimating issue. Software reuse is defined as “the use of existing software, or software knowledge, to build new software.” This definition came from Wikipedia. From a software cost estimating perspective the above definition is part of the problem. The definition should read: "Use of existing software with no changes for operation in the new software program.” If the existing software is going to be changed, ...
Original Post Date: Wednesday, October 27, 2010 Here’s a cool project.  The Bloodhound Project  is focused on building a land vehicle capable of breaking the 1000mph speed barrier.  The mission of the project is twofold.  The first is to “overcome the impossible using science, technology, engineering and mathematics”.  But the second is more interesting – this project is intended as motivation for the upcoming generation to embrace technology related fields.  Britain doesn’t have enough students interested in such fields and they are worried about their ability to compete in technological forays going forward. But how much should something like this ...
Original Post Date: Thursday, September 23, 2010 You need 3 things for your software estimates to be successful. And I will add a fourth one in after I talk about the first 3. 1. You need qualified and experienced people to generate the estimates. They have to know how to estimate and they have to understand the problem that the project is going to solve… at least well enough to estimate it. This can be one person or many depending on the difficulty of the business area. The harder it is, the more it helps to have more brains look at the problem. But not to the point ...
Original Post Date: Tuesday, October 12, 2010 National Boss Day is quickly approaching! While October 16th is the actual day, this year it will be observed on Oct 15th since the 16th falls on a Saturday – and what boss wants to hear from his or her employees on a day off, even to be showered with cards, flowers and accolades? According to Barry Wood, Boss Day was started in 1958 when Patricia Bays Haroski of Deerfield, Ill., registered it as a special date with the U.S. Chamber of Commerce to honor her boss (who was also her father). October ...
Original Post Date: Wednesday, October 6, 2010 Over the past several weeks a number of users have inquired about the best way to model legacy software that is being modified when estimating software costs. The software component within the TruePlanning Software model has an input parameter called “Adapted Code Size.” This input parameter accounts for existing or legacy software that will be modified or changed to meet a new requirement. Tied to the size input parameter is Percent Design/Code/Test Adapted. Although the model will calculate a percentage for each input, I would recommend that users analyze the calculated values and override the calculation where required. The percentage ...
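As a generic illustration of how adaptation percentages typically feed an effective-size calculation (this is a common textbook formulation, similar in spirit to the COCOMO adaptation adjustment, and not necessarily TruePlanning's internal math), consider:

```python
# Generic effective-size sketch for adapted (legacy) code.
# The weights and formulation are a common textbook convention,
# not necessarily TruePlanning's internal calculation.

def effective_size(new_sloc, adapted_sloc, pct_design, pct_code, pct_test):
    """Percentages are 0-100; returns an equivalent-new-code size."""
    adaptation_factor = (0.40 * pct_design + 0.30 * pct_code + 0.30 * pct_test) / 100.0
    return new_sloc + adapted_sloc * adaptation_factor

if __name__ == "__main__":
    # 10 KSLOC of new code plus 50 KSLOC of legacy code that needs
    # 20% redesign, 30% recode and 60% retest.
    print(f"{effective_size(10_000, 50_000, 20, 30, 60):,.0f} equivalent SLOC")
```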
Original Post Date: Wednesday, October 6, 2010 The key cost driver when estimating software costs is the size of the product. The problem is that there is no perfect technique available to measure and quantify the size of software. The two major techniques in use today are Source Lines of Code and Function Points. Today we will talk about Source Lines of Code, or SLOC. Source Lines of Code measures logical lines of code. It takes some of the uncertainty out of physical line-of-code measures by counting only complete statements (which can cross over more than one physical line). SLOC excludes comments and blank ...
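A toy counter makes the distinction visible: physical lines simply count line breaks, while a (simplified) logical count skips blanks and comments and counts complete statements even when they span several physical lines. The rules below are deliberately crude, handle only a C-like language, and ignore block comments and strings.

```python
# Simplified SLOC counter for a C-like language: compares raw physical lines,
# non-blank/non-comment lines, and a rough "logical" count of statements.
# Block comments, strings containing ';' and other corner cases are ignored.

def count_sloc(source: str):
    physical = source.count("\n") + (0 if source.endswith("\n") else 1)
    noncomment_nonblank = 0
    logical = 0
    for line in source.splitlines():
        stripped = line.strip()
        if not stripped or stripped.startswith("//"):
            continue                      # skip blanks and whole-line comments
        noncomment_nonblank += 1
        code = stripped.split("//")[0]    # drop trailing comments
        logical += code.count(";")        # one statement per ';' (rough)
    return physical, noncomment_nonblank, logical

sample = """// add two numbers
int add(int a,
        int b) {
    return a + b;   // spans one logical statement
}
"""
print(count_sloc(sample))   # (5, 4, 1) for this sample
```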
Original Post Date: Wednesday, September 29, 2010 Earlier this month Under Secretary of Defense Ashton Carter spoke at the 2010 Annual Air and Space Conference. His speech touched on some of the 5 categories that he and Defense Secretary Robert Gates laid out in order to identify low value activities and reapportion approximately $100 billion within the Defense Budget to higher value capabilities needed to support US Forces. The first of those categories he described has to do with "targeting affordability". In the context of a specific Navy program he explained this concept in a simple practical manner: "The way to do that is ...
Original Post Date: Wednesday, September 29, 2010 Recently I came across the word “off-label”.  It is the term used by the medical community when a drug is used to treat a condition for which it has not been approved by the Food and Drug Administration.   We sometimes use TruePlanning for “off-label” purposes. A good example would be using the TruePlanning Calibration tool to answer such questions as, what is the maximum number of source lines of code (SLOC) I can get and remain within my budget?  I call this TruePlanning Optimization. Here is an example answering the SLOC question. First begin ...
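The same question can also be framed outside the tool as a simple search: for any cost model that increases with size, bisect on SLOC until the estimate just fits the budget. The cost function below is a notional stand-in, not a TruePlanning calculation.

```python
# Budget-constrained size search: find the largest SLOC whose estimated cost
# stays within budget. The cost function here is a notional stand-in.

def estimated_cost(sloc: float) -> float:
    """Toy effort model: cost grows a bit faster than linearly with size."""
    return 0.004 * sloc ** 1.1          # notional $K

def max_sloc_within(budget: float, lo: float = 0.0, hi: float = 5e6) -> float:
    """Bisection search; assumes estimated_cost() is increasing in SLOC."""
    for _ in range(60):                 # plenty of iterations for convergence
        mid = (lo + hi) / 2.0
        if estimated_cost(mid) <= budget:
            lo = mid                    # affordable: push size upward
        else:
            hi = mid                    # too expensive: pull size downward
    return lo

if __name__ == "__main__":
    budget_k = 1_500.0
    sloc = max_sloc_within(budget_k)
    print(f"~{sloc:,.0f} SLOC fits a {budget_k:,.0f} $K budget "
          f"(cost ~ {estimated_cost(sloc):,.0f} $K)")
```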
Original Post Date: Wednesday, September 1, 2010 Because I have enrolled in several on-line fiction writing workshops, I regularly receive newsletters about upcoming events in the world of fiction writing.  Several weeks ago I was quite intrigued when I received an invitation to enter a ‘Hint Fiction’ writing contest.  Here I don’t even know what hint fiction is and someone thinks I might be good enough at it to enter a contest – who knew?    Naturally, I Googled hint fiction (how did we get by without Google?) and found out that it is  “a story of 25 words or ...
Original Post Date: Thursday, August 26, 2010 I recently moved and thought about the need to do a new household budget. This got me thinking about the budgeting capability in TruePlanning. First, you can use TruePlanning to determine a budget. The time phased output, either monthly or annually, is ideal for establishing a budget and budget profile for your project. TruePlanning also splits the time phased costs into development, production, and operating and support categories. Be cautious, however, in using the Phase report. The System cost object costs are assigned by schedule duration, which may not necessarily reflect the actual project cost flow. A better choice may ...
Original Post Date: Friday, August 13, 2010 If you want to read an interesting article on EVM – check out ‘The Three Deadly Sins of EVM’  by Mike Mullaly.  In it he reflects some of my personal feelings about EVM but he does this much more eloquently than ‘it’s a crock’.  OK – while I have actually said that out loud – it’s probably a little too strong.  I do think that EVM may be a good tool to have in the toolbox – it’s just not the project panacea that so many make it out to be.  And it ...
Original Post Date: Friday, July 30, 2010 Earlier this week I presented a webinar on the topic of SOA governance – specifically focused on making sure that organizations include SOA governance as they plan to deploy SOA capabilities.  As sometimes happens when I am giving a presentation (especially one I have given before), I was struck with somewhat of an epiphany as I was relaying the material on my slides.  In this case it was not really a new idea about the material, but more a deeper understanding of why this topic really is important. To be honest, when I first ...
Original Post Date: Thursday, July 29, 2010 The following is an extract from a paper written in 1978 from one of the founders of PRICE Systems:  Two questions are often asked by those unfamiliar with TruePlanning’s approach to cost modeling: What is your CER (cost estimating relationship)? And what is your data base? These questions are closely related.  Both are based on the assumption that the PRICE modeling approach is the same as that customarily used in developing cost estimating relationships.  This is not the case.  The customary approach is to first gather as much relevant data as possible, then screen the ...
Original Post Date: Thursday, July 22, 2010 Recently the Director of the Office of Management & Budget (OMB), Peter Orszag, issued a directive that was posted on the OMB blog and that outlined three specific actions for IT reform. The actions include a freeze on all new IT modernization task orders for financial systems; reviews of current high-risk IT projects, with agencies required to submit improvement plans to the CIO; and third, a requirement that the OMB Deputy Director develop recommendations within 120 days to improve the federal government’s overall IT procurement and management practices. Orszag states: “While a productivity boom has transformed private sector ...
Original Post Date: Thursday, July 8, 2010 Which came first, the chicken or the egg? We can look to Darwin for one theory, the Bible for another, but at the end of the day – nobody really knows. There can be no chicken without an egg, nor can there be an egg with no chicken. Thus we are left with a bit of a circuitous conundrum. Joint Confidence Level (JCL), NASA’s current best practice for program planning and management, also presents a circuitous conundrum. When a program has a JCL of 70% this implies that there is a 70% confidence that ...
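To make the 70% figure concrete, the sketch below runs a small Monte Carlo in which the joint confidence level is the fraction of trials meeting the cost target and the schedule target at the same time; the distributions, targets and correlation are illustrative only, not NASA's JCL implementation.

```python
# Joint Confidence Level sketch: fraction of Monte Carlo trials in which
# BOTH the cost target and the schedule target are met. Distributions and
# the cost/schedule correlation below are illustrative, not program data.
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
corr = 0.6                                          # cost/schedule correlation
cov = [[1.0, corr], [corr, 1.0]]
z = rng.multivariate_normal([0.0, 0.0], cov, size=n)

cost     = np.exp(np.log(100.0) + 0.20 * z[:, 0])   # lognormal, median 100 $M
schedule = np.exp(np.log(36.0)  + 0.15 * z[:, 1])   # lognormal, median 36 months

cost_target, sched_target = 115.0, 40.0
jcl = np.mean((cost <= cost_target) & (schedule <= sched_target))
print(f"P(cost ok)     = {np.mean(cost <= cost_target):.2f}")
print(f"P(schedule ok) = {np.mean(schedule <= sched_target):.2f}")
print(f"Joint CL       = {jcl:.2f}   (lower than either marginal)")
```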
Original Post Date: Tuesday, July 6, 2010  Those words describe the triangular probability distribution used by FRISK in performing risk analysis in TruePlanning. FRISK requires two major assumptions on the part of the user. The first is that the combination or convolution of a number of triangular distributions results in a log normal distribution. The second is that there is correlation between cost objects.  The triangular distribution is completely defined by three simple inputs: an optimistic value, a pessimistic value, and a most likely value. By eliciting information from engineers, I have found that they are much more willing to commit to a range ...
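The first assumption is easy to probe numerically: sum a few independent triangular cost-object distributions and compare the result with a lognormal fitted to the same samples (the correlation assumption is left out of this quick sketch for brevity). The triples below are arbitrary illustrations of the idea, not FRISK's actual algebra.

```python
# Quick numerical look at the first FRISK-style assumption: the sum of several
# triangular cost-object distributions is well approximated by a lognormal.
# The (optimistic, most likely, pessimistic) triples are arbitrary examples.
import numpy as np

rng = np.random.default_rng(7)
n = 200_000

cost_objects = [(8, 10, 15), (20, 25, 40), (5, 6, 9), (12, 15, 24)]
total = sum(rng.triangular(lo, ml, hi, size=n) for lo, ml, hi in cost_objects)

# Fit a lognormal by matching the mean and standard deviation of log(total)
mu, sigma = np.log(total).mean(), np.log(total).std()
fitted = rng.lognormal(mu, sigma, size=n)

for p in (0.10, 0.50, 0.80, 0.95):
    print(f"P{int(p * 100):02d}:  simulated total {np.quantile(total, p):6.1f}"
          f"   lognormal fit {np.quantile(fitted, p):6.1f}")
```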
Original Post Date: Thursday, June 10, 2010 This week I was thinking how useful the Export Import feature in TruePlanning can be. First, the Excel Import spreadsheet gives you an easy and convenient way to gather your data. In addition, it gives you an easy and convenient way to check and validate the data. When observing the data in a column format, it is so easy to spot and correct anomalies. Second, the Excel Import spreadsheet gives you an easy and convenient way to build your Product Breakdown Structure. No more inserting one cost object at a time. The Excel Import feature does it all ...
Original Post Date: Tuesday, June 8, 2010  Now that we know the background on the original concept of TRLs (Technology Readiness Levels - reference Arlene Minkiewicz's earlier blog post), we want to address estimating costs associated with different TRL levels. It is important to realize that a model cannot estimate TRL costs by simply changing an input parameter. Rather, the only way to estimate costs associated with different TRL levels is to model the scenario. For example, if you are estimating costs for a TRL 2 phase, the input parameters would be very different than if estimating costs for TRL level ...