• Predictive Analytics for Improved Cost Management

We have all likely heard of “should cost” estimates. Boiled down: if everything goes as planned, how much should the program cost? Are there efficiencies or best practices we can employ to get the job done faster and cheaper? We may use analogies, parametrics, bottom-up estimates, or extrapolation of actual costs to make a determination. How many of us have heard of “should price”? “Should price” is the determination of the reasonableness of price for commercial items. When contracting personnel receive quotes for off-the-shelf items, they need tools at their disposal to determine if the item is fairly priced. On ...
Original Post Date: Tuesday, July 20, 2010 Next month (8/4 @ 12pm EST) I am presenting a webinar to discuss using TruePlanning on Source Selections. What prompted me to develop this webinar were the many recent success stories I’ve had using TruePlanning during the Source Selection process. Going a bit further, I am going to show an actual case study where TruePlanning was used to conduct an Analysis of Alternatives (AoA) exercise – along with cost/effectiveness results. We will explore a bit about the technical side of the proposed designs, develop the modeling in TruePlanning and discuss the results. In addition, we will explore ...
Original Post Date: Wednesday, November 10, 2010 I was recently struck by Ash Carter’s (Under Secretary of Defense for Acquisition, Technology & Logistics) Memorandum for Acquisition Professionals, Better Buying Power: Guidance for Obtaining Greater Efficiency and Productivity in Defense Spending (14 September 2010). Within this broad, sweeping memo, Ash Carter outlines 23 principal actions in five major areas aimed at increasing efficiency in Defense acquisition. The first major area covered is “Target Affordability and Control Cost Growth”. Within this area, program managers must treat affordability as a requirement before milestone authority is granted to proceed (starting with Milestone A). This ...
Original Post Date: Friday, December 17, 2010 In last month’s blog I wrote about Ash Carter’s (Under Secretary of Defense for Acquisition, Technology & Logistics) Memorandum for Acquisition Professionals, Better Buying Power: Guidance for Obtaining Greater Efficiency and Productivity in Defense Spending (14 September 2010). I concluded that TruePlanning’s unified framework and comprehensive cost models make it a tool very well suited to provide the types of analysis outlined in the memorandum. In terms of Should Cost and Independent Cost Estimates (ICE), TruePlanning estimation software provides the industry-standard capability to conduct Should Cost and calibration (actual program history) for ICE. Most ...
Original Post Date: Wednesday, June 23, 2010 Parametric modeling is excellent for all aspects of early-concept cost estimation, including go/no-go decisions downstream. So, in the spirit of bringing transparency to (ethical) financial engineering… why not apply our craft to pricing “real-options”? The latter are essentially strategic opportunities for engaging resources (cost/schedule) in projects, ventures, investments, or even abandonments. The opportunity choice has value itself! Unlike static project Net Present Value (often, but not exclusively, approximated with Discounted Cash Flow), which assumes pre-defined decisions, real-options reflect the merit of flexibility. If an R&D or proof-of-concept presents viability/marketability learning, the option has positive value, above ...
Original Post Date: Thursday, October 7, 2010 Ahhhh, the 80s… a challenging (but often confusing) time in an evolving computing world. Working in 1985 as a software estimator as well as SQA engineer in a quality assurance department that “audited” real-time projects using new concepts like OOD & OOP… well, you get the picture. It was a great time to get immersed in great work. And the good news: that company’s process as well as its developers were bullish on a young estimation/quality type asking plenty of questions… as long as they were of the Yes-No variety. And ...
Original Post Date: Thursday, August 12, 2010 The late Philip Crosby’s “Quality is Free” taught us that an investment in quality is more than offset by prevention of defects based upon understanding of requirements. Only with the latter can lack of conformance (and subsequent costs) be captured and hence quality quantified. So how then is Parametrics relevant? Parametric estimating is more than cost modeling. Our craft represents an initial consulting function into the accuracy and completeness of program planning concepts. Our customers trust us to know when to ask and when to supplement. Yes, we are mathematical and financial modelers too. But I’d suggest that “Parametrics is ...
Original Post Date: Tuesday, August 24, 2010 Over the past several weeks, several users have inquired about the best way to estimate costs associated with porting existing software to a new hardware environment. Normally in this situation, some of the existing software will require adaptation to operate on the new server, while a large portion will only require integration into the new environment. Estimating the associated software costs will require the use of several cost objects: - Systems cost object if program management, Quality Assurance, configuration, and documentation costs are to be included in ...
Original Post Date: Monday, September 20, 2010 I have been fortunate in my career to have been associated with some great mentors. Each individual has provided me a bit of a golden nugget to carry with me as I tried to navigate my way through the professional waters. My first “civilian” manager, after I left the service and joined industry, provided me a list of the Laws of Analysis (I had just started a position as an operations research analyst). He explained that this list was a mix of serious and tongue in cheek snippets of wisdom. I looked at ...
Original Post Date: Monday, November 8, 2010 In her 1878 book Molly Bawn, Margaret Wolfe Hungerford coined the phrase “Beauty is in the eye of the beholder”. This concept of the “value of beauty” has been expressed by others, such as Benjamin Franklin in Poor Richard’s Almanack (1741): “Beauty, like supreme dominion / Is but supported by opinion”; and David Hume in Essays, Moral and Political (1742): “Beauty in things exists merely in the mind which contemplates them.” So what does this have to do with Cost Benefit? Well, the Merriam-Webster dictionary defines benefit as something that provides useful aid. Inherent in the term “useful” is ...
Original Post Date: Tuesday, July 1, 2014 Whether you’re doing a software cost estimate to support a Bid and Proposal effort, a software valuation, should cost analysis, or to develop a detailed project plan, it is vitally important to understand the ‘size’ of the software you are estimating.  The problem with software size is that it tends to fall into the intangible realm of reality.  If you tell me you are building a widget that weighs 13 pounds, I can really start to get my head around the task at hand.  If I’m chatting about this with my European colleagues, ...
Original Post Date: Friday, June 20, 2014 Proposal estimates based on grassroots engineering judgment are necessary to achieve company buy-in, but often are not convincing or not in sync with the price-to-win. This contention can be resolved by comparing the grassroots estimate to an estimate developed using data-driven parametric techniques. Parametric estimates apply statistical relationships to project data to determine likely costs for a project. Of course, for a parametric model to properly support this cross-check of the grassroots estimate, the proper data must be fed into the model. This most likely requires the estimator to reach ...
Original Post Date: Wednesday, April 2, 2014 Introduction Parametric estimates provide reliable, reproducible, and flexible views into cost and effort, so it’s only natural to want to include this data in a bid and proposal workflow. With TruePlanning 2014, big steps have been taken to make such integration seamless and easily reproducible. New tools in the TruePlanning suite of products, as well as integrations with some of the major bid and proposal software applications, are at the heart of this new feature set. You can learn more about TruePlanning 2014 and the PRICE cost estimation models at our website, but let's ...
Original Post Date: Thursday, March 20, 2014 One of the complications in generating Bids and Proposals for Modules and Microcircuits is determining the “Should Cost” for better cost realism. Most of the electronic modules and their components in the proposals are not actually manufactured by the Proposer, but rather by a subcontractor, thus becoming a Purchased item. It is difficult to determine the cost of making the Module and to establish a fair price. Costs for the modules include Assembly and Test costs together with the component costs. Components such as ASICs (Application-Specific Integrated Circuits) have both the cost of developing the devices and ...
Original Post Date: Monday, December 30, 2013 Unless you live under a rock, you are aware of the healthcare.gov rollout disaster. While similar IT failures are regularly in the news, the high profile of healthcare.gov has really mainstreamed awareness of the fragility of many IT projects. Check out this article entitled ‘The Worst IT project disasters of 2013’. It details IT project failures such as: IBM’s failure to deliver on a payroll system project that could potentially cost taxpayers up to $1.1 billion US. SAP’s failure to deliver satisfactorily on requirements for ...
Original Post Date: Friday, October 4, 2013 The late Philip Crosby’s “Quality is Free” taught us that an investment in quality is more than offset by prevention of defects based upon understanding of requirements. Only with the latter can lack of conformance (and subsequent costs) be captured and hence quality quantified. So how then is Parametrics relevant? Parametric estimating is more than cost modeling. Our craft represents an initial consulting function into the accuracy and completeness of program planning concepts. Our customers trust us to know when to ask and when to supplement. Yes, we are mathematical and financial modelers ...
Original Post Date: Friday, October 4, 2013 In Parametrics is Free, I acknowledged receiving (too late) “you should’ve known to ask that” over the years. Quality control after-the-fact is fine; but it’s better and cheaper to take a systematic approach to quality assurance as part of your estimating process. The sheer volume of what we model can often keep us so close to the details that we are unable to step back and put our QA hat on for a sanity check. Enter Quality! On a very large project, our team has introduced a few regular cross-checks, notwithstanding typical ...
Original Post Date: Friday, October 4, 2013 Parametric modeling is excellent for all aspects of early-concept cost estimation, including go/no-go decisions downstream. So, in the spirit of bringing transparency to (ethical) financial engineering… why not apply our craft to pricing “real-options”? The latter are essentially strategic opportunities for engaging resources (cost/schedule) in projects, ventures, investments, or even abandonments. The opportunity choice has value itself! Unlike static project Net Present Value (often, but not exclusively, approximated with Discounted Cash Flow), which assumes pre-defined decisions, real-options reflect the merit of flexibility. If an R&D or proof-of-concept presents viability/marketability learning, the option has positive ...
Original Post Date: Friday, October 4, 2013 Ahhhh, the 80s… a challenging (but often confusing) time in an evolving computing world. Working in 1985 as a software estimator as well as SQA engineer in a quality assurance department that “audited” real-time projects using new concepts like OOD & OOP… well, you get the picture. It was a great time to get immersed in great work. And the good news: that company’s process as well as its developers were bullish on a young estimation/quality type asking plenty of questions… as long as they were of the Yes-No variety. And ask ...
Original Post Date: Friday, October 4, 201 My "Real Options Valuation" blog suggested the use of parametrics in real options valuation. I’d like to offer the generalized use of our type of modeling in valuing tangible assets. Typically, fundamental analysis evaluates the intrinsic value of securities. I won’t attempt to compete with Warren Buffett here. But it is certainly the case that a company, or portfolio of securities reflecting many companies, is based in part on the market value of its product assets and their potential for future earnings, as well as other objective and subjective considerations. In parametric estimation, ...
Original Post Date: Wednesday, September 25, 2013 The “Systems Folder” cost object, which is found at the start of every TruePlanning Project, is most often confused with the “Folder” icon. The two, however, should not be confused. The “Folder” icon does not have an input sheet at all. It is not a cost object and contains no cost estimating logic or relationships. It is provided as a collection point so that cost objects can be grouped for clarity, for example to separate out phases of the acquisition lifecycle or to divide costs between subcontractors. The “Systems Folder”, by contrast, contains all ...
Original Post Date: Wednesday, September 25, 2013 We may all agree that risk analysis is a necessary, vital part of any valid/defensible cost estimate.  We may not agree as much on the best approach to take to quantify risk in an estimate.  All estimates contain risk.  In the words of a wise cost estimator I know, “That’s why they’re called estimates, and not exactimates!”  We must quantify and manage levels of risk.  Why?  One vital part of a successful program is the ability to build a budget based on reliable cost projections.  Reliability increases when we can analyze inherent risk, ...
Original Post Date: Wednesday, September 25, 2013 In Government contracting, all contracts are made up of a network of suppliers. The Prime contractor who won the overall bid usually has a supply chain of vendors from whom they receive products and services. In addition, they have Subcontractors who provide services under a contracted agreement of work. These vendors and subcontractors most likely have their own networks of suppliers, which allows for a cost-effective supply chain that extends across America and to other nations. Vendors sell identical or similar products to different customers as part of their regular operations. These ...
Original Post Date: Wednesday, September 25, 2013 Introduction/Problem Statement A current client has expressed interest in the default values on the simple vs. the detailed input sheet. More specifically, the question arose because this particular customer, as well as others, had a misconception about the simple vs. the detailed input sheet default values. Most users did not realize that even if they were only entering values on the simple input sheet, the detailed input sheet default values were still being used in the calculation of their cost estimate. So the question became: how much are each of these default value inputs ...
Original Post Date: Wednesday, September 25, 2013 These days bidding can be a game, and contractor leadership is constantly making decisions on whether to take on risk in order to stay competitive or to bid conservatively for the safety of not overrunning.  You may complete a cost model for a program, and spend time analyzing the uncertainties behind each input and in the end find that your estimate lands at the 30% confidence level.  After some strategic analysis, the bid leadership team decides, we would like to bid at the 80% Confidence level, “please present your estimate to support that total”.  ...
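Moving a bid from the 30% to the 80% confidence level, as described above, amounts to reading a higher percentile off the program's cost distribution. Here is a minimal sketch of that idea in Python; the normal distribution and its parameters (mean 100, standard deviation 15) are purely illustrative assumptions, not data from any real program or tool.

```python
import random
import statistics

# Illustrative only: pretend a risk simulation produced 10,000 possible
# program costs (here drawn from an assumed normal distribution, $M).
random.seed(7)
costs = sorted(random.gauss(100, 15) for _ in range(10_000))

def cost_at_confidence(sorted_costs, level):
    """Cost at a given confidence level = that percentile of the distribution."""
    idx = min(int(level * len(sorted_costs)), len(sorted_costs) - 1)
    return sorted_costs[idx]

p30 = cost_at_confidence(costs, 0.30)   # a fairly aggressive bid
p80 = cost_at_confidence(costs, 0.80)   # a conservative bid
print(f"30% CL: {p30:.1f}  80% CL: {p80:.1f}  bid increase: {p80 - p30:.1f}")
```

The gap between the two percentiles is the price of moving from an aggressive bid to a conservative one; in practice the distribution would come from the uncertainty analysis behind the estimate, not an assumed normal.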
Original Post Date: Wednesday, September 25, 2013 “On 29 July 2003, the Acting Under Secretary of Defense (Acquisition, Technology and Logistics) signed a policy memorandum entitled “Policy for Unique Identification (UID) of Tangible Items – New Equipment, Major Modifications, and Reprocurements of Equipment and Spares”. This Policy made UID a mandatory DoD requirement on all new equipment and materiel delivered pursuant to solicitations issued on or after January 1, 2004. USD(AT&L) issued verbal guidance that tangible assets manufactured by DoD’s organic depots were to be considered “new” items which fall under UID marking policy, beginning 1 January, 2005. An item is considered “significant”, and will be uniquely ...
Original Post Date: Wednesday, September 25, 2013 I’ve recently had a number of users ask, “How do I model life cycle costs for a missile that just sits on a shelf?”  I had never actually tried to model this, but of course I know it’s possible.  So I turned to some of my fellow PRICE experts, and found that of course this is not the first time anyone has ever tried to model this kind of thing… Many ordnance weapons such as mortar shells, torpedoes, bombs, missiles and various projectiles are stockpiled until they are actually needed. These weapons ...
Original Post Date: Tuesday, September 24, 201 Risk Analysis Methodology Overview – Method of Moments In this second of three articles on risk analysis, we will discuss Method of Moments.  Method of Moments (MoM) is an alternative to Monte Carlo simulation.  Along with the methodology, we will present some pros and cons of using MoM over Monte Carlo.  What is a moment? Before we discuss the methodology behind MoM, we first need to talk about moments.  Caution:  for all the master statisticians out there, this article is meant to boil down complex topics in an easy to understand manner.  There are obviously ...
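To make the MoM/Monte Carlo contrast concrete, here is a small sketch assuming independent cost elements with triangular uncertainty; the element ranges are invented for illustration. For independent elements, Method of Moments simply adds the elements' means and variances, while Monte Carlo samples and sums:

```python
import random
import statistics

# Hypothetical WBS elements with (low, most likely, high) cost ranges, $K.
elements = [(80, 100, 140), (40, 60, 70), (150, 200, 280)]

def tri_mean(lo, mode, hi):
    # First moment (mean) of a triangular distribution.
    return (lo + mode + hi) / 3

def tri_var(lo, mode, hi):
    # Second central moment (variance) of a triangular distribution.
    return (lo*lo + mode*mode + hi*hi - lo*mode - lo*hi - mode*hi) / 18

# Method of Moments: for independent elements, moments just add.
mom_mean = sum(tri_mean(*e) for e in elements)
mom_sd = sum(tri_var(*e) for e in elements) ** 0.5

# Monte Carlo: sample every element and sum, many times over.
random.seed(1)
totals = [sum(random.triangular(lo, hi, mode) for lo, mode, hi in elements)
          for _ in range(20_000)]

print(f"MoM:         mean={mom_mean:.1f}  sd={mom_sd:.1f}")
print(f"Monte Carlo: mean={statistics.mean(totals):.1f}  sd={statistics.stdev(totals):.1f}")
```

MoM produces the total's moments instantly and repeatably (no sampling noise), but yields only moments; Monte Carlo is slower and noisy, but gives the full distribution, so percentiles and confidence levels fall out directly.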
Original Post Date: Friday, October 5, 2012 I am currently involved in co-authoring a white paper on the “Role of Value Engineering in Affordability Analysis.” For the purposes of this discussion I define affordability as that characteristic of a product or service that responds to the buyer’s price, performance, and availability needs simultaneously. The writing of this white paper has been an interesting exercise for me because my fellow co-authors come from different backgrounds and thus have very different points of view. The majority of my colleagues are “card carrying” Systems Engineers. As such, they have a perspective that ...
Original Post Date: Friday, September 7, 2012 So, what goes on out here in the corn fields, anyway?  Did you know Dayton, Ohio is home to the Air Force Materiel Command?  Air Force Materiel Command (AFMC), headquartered at Wright-Patterson AFB, Ohio, “develops, acquires and sustains the aerospace power needed to defend the United States and its interests for today and tomorrow. This is accomplished through management, research, acquisition, development, testing and maintenance of existing and future weapons systems and their components” (afmc.af.mil).   In response to future budget uncertainty and a challenge by Congress to operate more efficiently, AFMC recently ...
Original Post Date: Tuesday, July 3, 2012 Introduction A cost estimation model needs to be fed data, and it is only as good as the data used to create it. Frequently the data needed to ‘feed’ the cost model comes from a variety of sources, including engineers, subject matter experts, or other individuals not directly building the cost model. Just asking these experts to stick a finger in the air and take a guess isn’t always the best approach. Using the COM-enabled Data Input Form Excel solution that comes with TruePlanning can help users obtain the data needed to complete ...
Original Post Date: Tuesday, July 3, 2012 Introduction @RISK and Crystal Ball are two Excel based applications that allow users to perform uncertainty, sensitivity or risk analysis on data contained in Excel spreadsheets. The analysis can be performed using various techniques including Monte Carlo and Latin Hypercube.  Many TruePlanning users are interested in performing this type of analysis on the models they create in TruePlanning or may even have requirements to perform this type of analysis on their estimates.   In response to this desire, PRICE Systems L.L.C. has created two Excel based solutions that allow users to easily leverage the ...
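Since the excerpt mentions both Monte Carlo and Latin Hypercube, here is a short sketch of the difference between the two sampling schemes on a single uniform input. This is a pure illustration of the sampling idea, independent of how @RISK, Crystal Ball, or TruePlanning implement it.

```python
import random

def monte_carlo_uniforms(n, rng):
    # Plain Monte Carlo: n independent draws from U(0, 1);
    # by chance, some regions may be oversampled and others missed.
    return [rng.random() for _ in range(n)]

def latin_hypercube_uniforms(n, rng):
    # Latin Hypercube: one draw from each of n equal-width strata,
    # shuffled so the order carries no information. Coverage is guaranteed.
    samples = [(i + rng.random()) / n for i in range(n)]
    rng.shuffle(samples)
    return samples

rng = random.Random(42)
n = 10
mc = monte_carlo_uniforms(n, rng)
lhs = latin_hypercube_uniforms(n, rng)

# Each stratum [i/n, (i+1)/n) holds exactly one LHS sample.
strata = sorted(int(x * n + 1e-9) for x in lhs)
assert strata == list(range(n))
print("LHS strata covered:", strata)
```

Because Latin Hypercube stratifies each input, it typically needs far fewer trials than plain Monte Carlo for the output statistics to stabilize.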
Original Post Date: Monday, April 16, 2012 This year, the PRICE cost research team is kicking off a major research study on electronics.  The field of electronics changes very rapidly, and we want to ensure our estimation methods are well suited for the latest technology.  While we constantly collect data and update relationships, this study will go above and beyond the norm.  We'll visit modern electronics facilities, interview experts, visit customers to discuss their electronics estimating challenges, and start a major data collection and analysis effort.  In the end, we plan to add many new electronic classifications, reexamine our ...
Original Post Date: Thursday, April 5, 2012 Introduction The previous blog covered getting started with the TruePlanning API using VBA. Now it is time to tackle collections. Collections in the TruePlanning API are simply objects that hold a list of objects, where all the objects are of the same type. For example, the Project object contains a collection of the Cost Objects that belong to the project. In order to use the TruePlanning API, mastering collections is essential, though “mastering” is perhaps too strong a word given how easy they are to use. By the end of this blog TruePlanning API ...
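The collection pattern the excerpt describes is the common COM convention: a Count property plus 1-based Item access over objects of a single type. Here is a minimal Python sketch of the shape of such a traversal; the class and property names are illustrative stand-ins, not the real TruePlanning API.

```python
# A minimal sketch of the COM-style collection pattern: a 1-based
# Count/Item interface over objects of one type. All names here are
# illustrative, not the actual TruePlanning API.

class CostObject:
    def __init__(self, name):
        self.name = name

class Collection:
    def __init__(self, items):
        self._items = list(items)

    @property
    def count(self):
        return len(self._items)

    def item(self, index):          # COM collections are 1-based
        return self._items[index - 1]

class Project:
    def __init__(self, cost_objects):
        self.cost_objects = Collection(cost_objects)

project = Project([CostObject("System"), CostObject("Hardware"),
                   CostObject("Software")])

# Typical traversal, mirroring a VBA loop `For i = 1 To coll.Count`.
names = [project.cost_objects.item(i).name
         for i in range(1, project.cost_objects.count + 1)]
print(names)   # → ['System', 'Hardware', 'Software']
```

The 1-based indexing is the detail that most often trips up programmers coming from zero-based languages, which is why it is worth calling out explicitly.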
Original Post Date: Thursday, March 29, 2012 Introduction The new TruePlanning COM Application Programming Interface (API) found in the 2012SR1 release of TruePlanning provides a powerful mechanism for leveraging the power of TruePlanning within custom solutions. Through the COM API, TruePlanning projects can be created, updated, calculated, and saved, allowing for near limitless potential for integration with TruePlanning. That said, it is an API, which means some programming will need to be done. This discussion is focused on how to get started using the TruePlanning COM API. Development Environments The TruePlanning COM API is written in C++, but is available to any programming ...
Original Post Date: Friday, March 23, 2012 Recently I have been playing around with the International Software Benchmarking Standards Group (ISBSG) database for Development and Enhancement projects. In the interest of full disclosure, I admit that I am more than a little excited to have close to 6,000 data points at my fingertips. I will further admit that there’s something quite daunting about having this much data: where to start, what to look for, and how best to use this data to offer useful guidance for software cost estimation. For those of you not familiar ...
Original Post Date: Monday, December 12, 2011 Check out this paper “The Economics of Community Open Source Software Projects: An Empirical Analysis of Maintenance Effort.”  In it the authors hypothesize that Open Source practices increase the quality of the software that gets produced and subsequently lead to code that is less costly to maintain.  Low quality code must be refactored more frequently than high quality code and there is substantial evidence that maintenance interventions tend to lead to even more degradation of the quality of the code.  So not only are low quality applications more expensive to maintain, the unit ...
Original Post Date: Thursday, October 20, 2011 Check out this article, “Why IT Projects May Be Riskier than You Think”. If you read through the comments you will see that this article truly resonates with many in the field. In the article the authors discuss research on 1,471 IT projects (large projects with an average cost of $167 million), comparing budgets and expected performance with actual costs and results. Their results were surprising in that the average overrun was only 27%. It turns out that the average isn’t what requires study but rather the outliers. The study found that ...
Original Post Date: Friday, September 2, 2011 The IEEE published “Top 11 Technologies of the Decade” in the January 2011 edition of IEEE Spectrum magazine. It should come as a surprise to no one that the Smartphone was number 1 on this list. The answer to the author’s question “Is your phone smarter than a fifth grader?” was a resounding YES![1] In 1983 Motorola introduced the first handheld cellular phone. It weighed in at two and a half pounds, had memory capacity for 30 phone numbers, took 10 hours to recharge, and had a selling price of $4000 ...
Original Post Date: Friday, June 10, 2011 I’m on my way home from the ISPA/SCEA (International Society of Parametric Analysts, Society of Cost Estimating and Analysis) Conference held in Albuquerque this week. Attendance was very good (2nd best in the conference’s history) and the content seemed especially good this week. I attended lots of good talks on topics ranging from SEPM (System Engineering, Project Management) cost estimating, Joint Confidence Levels, Software Estimating, Affordability, Agile software development and estimating for Enterprise Resource Planning Systems. Of course, just because the topics are good and well presented doesn’t mean I have ...
Original Post Date: Thursday, May 19, 2011 I was recently asked by a client to provide a synopsis of what TruePlanning offers in response to the Ashton Carter Memorandum – Implementation of Will-Cost and Should-Cost Management. In the memo, the Under Secretary of Defense for AT&L listed “Selected Ingredients of Should Cost Management”. It was interesting to note how much capability is provided by TruePlanning to effectively support efficient should cost management. In this month’s blog, I will share my response to our client with you. ...
Original Post Date: Tuesday, March 29, 2011 “I think we have an obligation to work with industry to ensure that our suppliers do not just remain world class in defence, but aspire to be world-class manufacturers that can withstand comparison to other industries.” Chief of Defence Procurement, Sir Robert Walmsley. Is this a practical proposition or is it a pipe dream? The following excerpt from Dale Shermon’s Systems Cost Engineering attempts to make the case that this type of comparison is possible. Many of the statements in proposals and marketing literature stating the superiority of a company are anecdotal or at best qualitative ...
Original Post Date: Wednesday, March 16, 2011 In Parametrics is Free, I acknowledged receiving (too late) “you should’ve known to ask that” over the years. Quality control after-the-fact is fine; but it’s better and cheaper to take a systematic approach to quality assurance as part of your estimating process. The sheer volume of what we model can often keep us so close to the details that we are unable to step back and put our QA hat on for a sanity check. Enter Quality! On a very large project, our team has introduced a few regular cross-checks, notwithstanding typical math check-sums. A round table peer ...
Original Post Date: Thursday, March 10, 2011 My previous blog discussed a “Should Cost” methodology used by PRICE Systems to complete an analysis. In the article I included a chart depicting calibration results for manufacturing complexities for each weapon system (X-axis). Manufacturing complexities are a major cost driver within the model. This parameter can be derived from model knowledge tables, generators, or from calibration. Many times the calibrated results are simply averaged and used for predicting cost for the new system. This assumes that the new system is very similar in technology and performance to the systems used for calibration. In general this is not the ...
Original Post Date: Monday, February 28, 2011 PRICE Systems recently accepted an assignment to complete a "Should Cost" estimate for a U.S. ally on a weapon system. The estimate included not only analysis of production costs, but also should-cost analysis of various operations and support costs. The only information provided by the client was quantity and time frame for production. A major ground rule for the estimate was that all data specific to the weapon system must come from publicly available information. For example, mass, manufacturing process, and learning curve information must come from the public domain. After reviewing the scope for the estimate, we decided to also ...
Original Post Date: Tuesday, February 22, 2011 In the February 2011 issue of National Defense, I was struck by the article “Uncertain Path Ahead for Military Truck Fleet”[1]. This article centered on the best strategies for modernization of the aging fleet of Humvees. The recapitalization of 150,000 Army and 25,000 Marine Corps Humvees is creating a “fix or buy new” dilemma for decision makers. According to the article, GAO analyst Michael J. Sullivan recommended a “cost-benefit analysis that would minimize the collective acquisition and support costs of the various truck programs, and reduce the risk of overlap or ...
Original Post Date: Thursday, February 17, 2011 My June blog entry suggested the use of parametrics in real-options valuation. This month, I’d like to offer the generalized use of our type of modeling in valuing tangible assets. Typically, fundamental analysis evaluates the intrinsic value of securities. I won’t attempt to compete with Warren Buffett here. But it is certainly the case that a company, or portfolio of securities reflecting many companies, is based in part on the market value of its product assets and their potential for future earnings, as well as other objective and subjective considerations. In parametric estimation, we take a top-down ...
Original Post Date: Friday, February 11, 2011 The DoD Cost Analysis Symposium (DODCAS 2011) is next week, Feb 15-18.  I’ll be there along with several of my colleagues at PRICE Systems.  This conference consistently provides an excellent source of information and shared experiences for the acquisition community and I am anxious to attend again this year.  Last year the conference occurred shortly after Congress passed the Weapons System Acquisition Reform Act of 2009 (WSARA) - and the majority of the sessions were focused on discussions about how the services, contractors and the government leadership planned on dealing with this new law.  From ...
Original Post Date: Tuesday, January 11, 2011 Last week I gave a webinar which detailed the PRICE perspective on Should Cost & Will Cost Management. The responses I have received have been very positive and informative. For those of you who could not attend, you can view the recorded version of that webinar here. Below is a brief summation of that presentation and some key takeaways. The Under Secretary of Defense issued a memo late last year. The thrust of the memo was the current need for greater efficiency and productivity in defense spending. His guidance contained 23 principal actions for improving the ...
Original Post Date: Thursday, January 6, 2011 Today, PRICE Systems Senior Research Analyst Bob Koury will be presenting on Will Cost/Should Cost management. The presentation will focus on two main requirements mandated in the Ash Carter memo (mentioned here several times): developing Should Cost/Will Cost targets and establishing Affordability as a requirement. An example will be provided of how parametric estimating models were used to establish “Should Cost” targets and how they can be used by a budget authority (government or industry) to be an informed consumer of contractor or sub-contractor bids. The demonstration portion of this webinar will focus on ...