• Predictive Analytics for Improved Cost Management






The Capability Maturity Model Integration – Development (CMMI-Dev) is a model designed to help organizations achieve and institutionalize process maturity. CMMI specifies the goals that need to be achieved to reach a specific maturity or capability level – it is neither rigid nor prescriptive about exactly how these goals are to be met.  Agile is a philosophy and set of tenets for software projects characterized by highly collaborative, cross-functional teams who work closely with their customers to deliver regular increments of functional software capability that satisfies customers and end users.  Neither the agile philosophy ...
In the previous (second) blog in this series, we discussed using NIST Special Publication 800-171 Appendix E to list all possible cyber security requirements.  We then down-selected the full list of 123 items to roughly 60 that may directly impact the software development process.  Now we will cover how the impact of those 60 items can be included in a TruePlanning® estimate. I will offer three primary methods for accounting for the additional effort of cyber security requirements.  We will look at modeling the requirements as individual cost objects in the estimate.  We will then consider setting inputs ...
We will pick up where we left off on estimating the cost of cyber security by looking at requirements.  Recall from a previous blog that the requirements for Cyber Security are outlined in Appendix E of the National Institute of Standards and Technology (NIST) Special Publication 800-171 document titled “Protecting Controlled Unclassified Information in Nonfederal Information Systems and Organizations.”  In Appendix E, there are a series of tables that outline the requirement, as well as the responsible authority for ensuring those requirements are met.  There are four categories of requirements*: NCO: Not directly related to protecting ...
A recent article in National Defense Magazine highlighted the ever-increasing need for cyber security.  (See http://www.nationaldefensemagazine.org/articles/2016/12/12/pentagon-paying-more-to-be-hacked)  When working on a software estimate for a program office here at Wright-Patterson AFB, I was asked, “How do you handle cyber security requirements?”  My response was, “What does that mean for your program?  How are the requirements different?”  There was no good answer.  We may be required to incorporate cyber security requirements into a new software project, but there is no good guidance as to exactly what that means.  We can probably assume that costs are higher for a ...
If you’ve taken my Software Training class, you’ve heard me use the analogy of “taking someone else’s spreadsheet and adding your own logic” to distinguish among modifications, adapted code and glue code.  But let’s take a step back to make sure we’re all in agreement {if not, blame me, not the product!} #1.) To be clear, COTS is shrink-wrapped, ready to go with near-zero modification to core functionality.  Generally, we prefer to see COTS modification of no more than 10%. #2.) If this core functionality does need modification, then we recommend using the SW Component object with Adapted code, as well as Reused ...
We’ve kicked off a study of the cost impacts of various development standards, and this post discusses a customer request on the cost impacts of IEEE/EIA 12207. IEEE 12207 establishes a common framework for software life cycle processes, with well-defined terminology that can be referenced by the software industry [1].  Adherence to this standard helps to eliminate misunderstandings between contractors and procurers and significantly improves chances of mission success, a major part of which is preventing cost and schedule overruns [2, 3]. IEEE 12207 contains a set of management, engineering, and data requirements for all parties involved (acquirers, suppliers, developers, operators, ...
Here’s something I’ve been thinking about a lot lately – technical debt and its relationship to software maintenance costs.  Technical debt speaks to the structural quality of software applications.  Technical debt is incurred for many different reasons; sometimes it is intentional, when shortcuts are taken to meet a time-to-market requirement; sometimes it occurs because a development team gets sloppy about applying good coding practices (or has not documented its coding practices); sometimes it happens when the technology in an application is not kept up to date and the application gets lapped by newer technology. Not all technical ...
Original Post Date: Friday, July 9, 2010  While sitting in the operatory chair yesterday, my dentist said something that made me stop. He was complaining about an increasing rate of incompetence and apathy he observes in those delivering services to him. And while I do agree with him in principle, he and I are of the age where some folks label us as grumpy old men. So, it may not be as bad as we think. Regardless, the statement he made to an unfortunate poor-quality service provider was, “If you don’t have the time to do it ...
Original Post Date: Monday, October 18, 2010 Some of us remember taking the Iowa tests during our early school days. The Iowa Tests of Basic Skills (ITBS) are standardized tests provided as a service to schools by the College of Education of The University of Iowa. The tests, administered to students in grades K-8, became a national standard for measuring scholastic aptitude – I was educated in Pennsylvania. Now out of Iowa comes another test of sorts, something called an Integrity Index Score based upon a proprietary algorithm of an organization called Iowa Live. Iowa Live calls itself, “a ...
Original Post Date: Wednesday, November 3, 2010 The midterm elections are finally over. The themes of reduced spending and lower taxes showed up in force at the ballot box. But what does that mean for the defense industry? The U.S. Secretary of Defense, Robert Gates, caused quite a stir when he announced his proposals for reining in defense spending. There is the expected assortment of eliminations (U.S. Joint Forces Command and the Business Transformation Agency, to name two), reductions (in service support contracts, the number of senior civilian executives and general/admiral military officers, and funding for intelligence community advisory contracts), freezes (of ...
Original Post Date: Wednesday, May 12, 2010 From my perspective as a cost researcher, the calibration tool is one of the most powerful analysis capabilities built into the TruePlanning cost management software. One way I can use this tool is to go back to an old estimate for a project that is now completed, and analyze the correctness of the previously entered input values. With this analysis, I can find ways to improve our methods of soliciting input values from the user to ensure the best values are entered the next time. This way, the TruePlanning models keep getting “smarter” as new information ...
Original Post Date: Friday, June 4, 2010 One of the great features of the TruePlanning cost management software is that it makes it easy to handle the complications of inflation and of estimating projects performed in different countries and currencies. The costs associated with doing work in different countries, and the relative value of different currencies, are constantly changing. To address this, the cost research team at PRICE performs an annual economic update, and this blog will introduce some of the basic concepts and research that go into maintaining this feature every year. The price of goods and ...
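To make the mechanics concrete, here is a minimal sketch of how an inflation index and an exchange rate combine in an estimate. The index values and exchange rate below are invented for illustration, not PRICE research data.

```python
# Minimal sketch of escalation plus currency conversion. The index values
# and exchange rate below are invented examples, not PRICE research data.

def escalate(cost, index_base, index_target):
    """Move a cost between years using the ratio of inflation index values."""
    return cost * (index_target / index_base)

def convert(cost, rate):
    """Convert a cost given the exchange rate (target units per source unit)."""
    return cost * rate

# 100,000 in base-year dollars, escalated with a hypothetical 4% index
# growth, then converted at a hypothetical exchange rate.
cost_2010_usd = escalate(100_000, index_base=1.00, index_target=1.04)
cost_2010_eur = convert(cost_2010_usd, rate=0.82)
```

The real annual update amounts to refreshing those index tables and rates so that every estimate picks up current economics automatically.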
Original Post Date: Monday, April 26, 2010  With so many acquisition programs over budget and behind schedule, the term “Cost Realism” is suddenly very popular. In my experience as an estimator on many major acquisition programs, two things have remained certain over the years (besides death and taxes). First, the probability of a program ever achieving its original cost estimate is exactly zero, and second, the more information that is known about a program, the more it will exceed its original cost estimate.    With that said, the move to Cost Realism is important because it recognizes these two fundamental ...
Original Post Date: Friday, May 21, 2010 Last month I blogged about the importance of cost realism, its roots and how as estimators we must always reflect the truth, no matter how unpopular. This month I want to share with you a recent experience on a Source Selection. As part of the Source Selection team, my role was to conduct a Cost Realism estimate on each of the performers submitting bids. I want to share with you a few insights from that experience. One of the first rules I always follow is to never ask engineers to provide data that ...
Original Post Date: Wednesday, June 30, 2010  I recently had the opportunity to work directly for one of our clients on a high-visibility, must-win proposal. The contractor was just about ready to commit to the bid number, but wanted to know the likely bids of the other two performers. We were asked to do a “Ghosting the Competition” study, in which we ethically collect open-source data on two competing designs and combine it with engineering technical data to develop a best estimate of the competitors’ bid positions.   Unfortunately, not much intelligence was known about the competing configurations, but the ...
Original Post Date: Tuesday, July 20, 2010 Next month (8/4 @ 12pm EST) I am presenting a webinar to discuss using TruePlanning on Source Selections. What prompted me to develop this webinar were the many recent success stories I’ve had using TruePlanning during the Source Selection process. Going a bit further, I am going to show an actual case study where TruePlanning was used to conduct an Analysis of Alternatives (AoA) exercise – along with cost/effectiveness results. We will explore a bit about the technical side of the proposed designs, develop the modeling in TruePlanning and discuss the results. In addition, we will explore ...
Original Post Date: Wednesday, September 1, 2010 I had expected to present my webinar, “Best Practices for Cost Effectiveness Studies using TruePlanning,” in early August. As you might know, I was planning to show a real-world example from a recent engagement with a government customer. Unfortunately, since the Source Selection has not concluded with a downselect, I was not able to obtain the public release in time. However, in this month’s blog I will continue to share some of the highlights of the webinar.   In last month’s blog we explored the uses of TruePlanning during Source Selection from the Supplier’s (or ...
Original Post Date: Wednesday, June 23, 2010 Parametric modeling is excellent for all aspects of early-concept cost estimation, including go/no-go decisions downstream. So, in the spirit of bringing transparency to (ethical) financial engineering… why not apply our craft to pricing “real options”? The latter are essentially strategic opportunities for engaging resources (cost/schedule) in projects, ventures, investments, or even abandonments. The opportunity to choose has value in itself!  Unlike static project Net Present Value (often, but not exclusively, approximated with Discounted Cash Flow), which assumes pre-defined decisions, real options reflect the merit of flexibility. If an R&D or proof-of-concept effort provides learning about viability or marketability, the option has positive value, above ...
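For contrast with the real-options view, here is a minimal static-NPV sketch via discounted cash flow. The cash flows and discount rate are invented for illustration.

```python
# A minimal static-NPV sketch via discounted cash flow, for contrast with
# the real-options view. Cash flows and discount rate are invented.

def npv(rate, cash_flows):
    """Net present value of cash flows indexed by year, starting at year 0."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

# Invest 1000 now for three years of 400 returns at a 10% discount rate.
value = npv(0.10, [-1000.0, 400.0, 400.0, 400.0])
# value is slightly negative, so static NPV says walk away - yet the option
# to expand or abandon after year 1 could still make the venture worthwhile
```

The point of real options is precisely what this static calculation misses: the decisions after year 0 are not actually pre-defined.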
Original Post Date: Friday, June 25, 2010  Like titanium and other exotic metal materials, “composites” (by definition, combinations of materials) offer significant weight savings and reduced part counts, but at the price of high production cost. Does that sound contrarian to our parametric cost estimating view?   Not really. Complexity of manufacture is considerably higher. Likewise, process index and structural tooling values grow. Plus, design lead times drive developmental cycles. That said, understand that composites represent more than a material type. They can involve a highly labor-intensive approach to preparing, braiding/winding, molding, bonding and modular assembly. Yes, some aspects of braiding and molding lend themselves to automation—which then drives tooling ...
Original Post Date: Thursday, October 7, 2010 Ahhhh, the 80s… a challenging (but often confusing) time in an evolving computing world.  Working in 1985 as a software estimator as well as an SQA engineer in a quality assurance department that “audited” real-time projects using new concepts like OOD & OOP… well, you get the picture.  It was a great time to get immersed in great work.  And the good news: that company’s process as well as its developers were bullish on a young estimation/quality type asking plenty of questions… as long as they were of the Yes-No variety.  And ...
Original Post Date: Monday, December 6, 2010  In his August blog entry here, Zach Jasnoff outlined typical client perspectives on the different types of analyses that TruePlanning can accommodate. Working on a large project, we’ve experienced situations where the initial intent and model structure later have the boundaries of model appropriateness stretched. An Analysis of Alternatives (AoA), for example, is meant to measure deltas between a baseline and its alternatives. If common costs “wash,” then they can be excluded… which becomes an issue when the estimate is treated as a Rough Order of Magnitude for customer budgeting.  Likewise, if a ROM or Independent Cost Estimate ...
Original Post Date: Tuesday, August 24, 2010 Over the past several weeks, several users have inquired about the best way to estimate costs associated with porting existing software to a new hardware environment. Normally in this situation, some of the existing software will require some amount of adaptation to operate on the new server. However, a large portion of the existing software will only require integration into the new environment.   Estimating the software costs associated with the above will require the use of several cost objects: - Systems cost object, if program management, Quality Assurance, configuration, and documentation costs are to be included in ...
Introduction: The goal of this blog is to show how data can flow between TruePlanning and ProPricer. This walkthrough is based on estimating a software application that will provide users the ability to track objects orbiting the Earth using a feed from some fictitious data stream. The benefit is the ability to get the labor requirement (effort in hours) estimated by TruePlanning into ProPricer in a seamless, easily repeatable process.   1. Create ProPricer Proposal for the Orbiting Body Tracking application The first step is to create a proposal in ProPricer with a WBS. Each task in ProPricer will have a set of ...
Original Post Date: Thursday, July 17, 2014 Introduction Parametric cost estimates provide high quality, defendable estimates early in a project’s life cycle. This makes them ideal when producing bid and proposals. The nature of parametric cost estimates, however, requires the results of the estimate to be framed in terms of specific CERs and Activities and Resources. It is common for an organization to have a more granular set of Resources than the ones used to support the CERs. One approach to resolving this issue would be to use the TrueMapper application from PRICE Systems to map TruePlanning Resources to a more ...
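As a rough illustration of the kind of mapping such a tool performs — the category names and the 40/60 split below are invented for illustration, not TrueMapper's actual scheme:

```python
# Hypothetical sketch of mapping one model-level labor resource to a more
# granular set of organizational categories. The category names and the
# 40/60 split are invented for illustration.

resource_map = {
    "Software Engineer": [("Sr Software Engineer", 0.40),
                          ("Jr Software Engineer", 0.60)],
}

def allocate(resource, hours):
    """Split estimated hours across mapped categories by allocation fraction.

    Resources with no mapping pass through unchanged.
    """
    targets = resource_map.get(resource, [(resource, 1.0)])
    return {category: hours * fraction for category, fraction in targets}

split = allocate("Software Engineer", 1000)
# 1000 estimated hours become 400 senior and 600 junior hours
```

However the split is implemented, the total hours must be conserved so the granular view still reconciles to the CER-level estimate.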
Original Post Date: Thursday, July 10, 2014 Introduction TruePlanning provides a powerful and highly customizable reporting environment. Project data can be viewed from many different perspectives and those perspectives can be saved and reused across all projects. Results can be exported to Excel and Word as custom reports. There are, however, some instances where there is a need to get beyond the two axes used in TruePlanning’s reporting engine. This need is sometimes expressed by users preparing TruePlanning cost estimating project data for use in a bid or proposal. Perhaps the data needs to be split by phase and labor/non-labor over ...
Original Post Date: Tuesday, July 1, 2014 Whether you’re doing a software cost estimate to support a Bid and Proposal effort, a software valuation, or a should-cost analysis, or you’re developing a detailed project plan, it is vitally important to understand the ‘size’ of the software you are estimating.  The problem with software size is that it tends to fall into the intangible realm.  If you tell me you are building a widget that weighs 13 pounds, I can really start to get my head around the task at hand.  If I’m chatting about this with my European colleagues, ...
Original Post Date: Friday, June 20, 2014 Proposal estimates based on grassroots engineering judgment are necessary to achieve company buy-in, but often are not convincing or are not in sync with the price-to-win.  This contention can be resolved by comparing the grassroots estimate to an estimate developed using data-driven parametric techniques.  Parametric estimates apply statistical relationships to project data to determine likely costs for a project.  Of course, for a parametric model to properly support this cross-check of the grassroots estimate, the proper data must be fed into the model.  This most likely requires the estimator to reach ...
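A toy illustration of such a statistical relationship — the coefficients below are invented for illustration, not an actual PRICE CER, which would be fit to historical project data:

```python
# Toy power-law cost estimating relationship (CER). The coefficients are
# invented for illustration; a real CER is fit to historical data.

def effort_hours(ksloc, a=300.0, b=1.1):
    """Effort grows slightly faster than linearly with size when b > 1."""
    return a * ksloc ** b

small = effort_hours(10)   # 10 KSLOC
large = effort_hours(20)   # 20 KSLOC
# doubling size more than doubles effort: a diseconomy of scale
```

A cross-check like this is useful precisely because the grassroots estimate rarely reflects such nonlinear scale effects.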
Original Post Date: Wednesday, April 2, 2014 Introduction Parametric estimates provide reliable, reproducible, and flexible views into cost and effort, so it’s only natural to want to include this data in a bid and proposal workflow. With TruePlanning 2014, big steps have been taken to make such integration seamless and easily reproducible.  New tools in the TruePlanning suite of products, as well as integrations with some of the major bid and proposal software applications, are at the heart of this new feature set. You can learn more about TruePlanning 2014 and the PRICE cost estimation models at our website, but let's ...
Original Post Date: Thursday, March 20, 2014 Here’s a conundrum.  You are a software estimator responsible for helping the decision makers in your company determine what business to pursue and what business to steer clear of.  You know that, to win profitable business, your company first needs to decide which opportunities are golden and which should be avoided.  You also know that, at the point at which this decision needs to be made, there is very little information available to support a quality estimate.  Add to this the fact that software estimation is hard at almost any stage.  What’s ...
Original Post Date: Thursday, March 20, 2014 One of the complications in generating Bids and Proposals for Modules and Microcircuits is determining the “Should Cost” for better cost realism. Most of the electronic modules and their components in a proposal are not actually manufactured by the Proposer, but rather by a subcontractor, thus becoming Purchased items. It is difficult to determine the cost of making the module and to establish a fair price. Costs for the modules include Assembly and Test costs together with the component costs. Components such as ASICs (Application Specific Integrated Circuits) have both the cost of developing the devices and ...
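A minimal sketch of the roll-up just described — purchased-component costs plus assembly and test labor — with entirely hypothetical figures:

```python
# Hypothetical module "should cost" roll-up: purchased-component costs plus
# assembly and test labor. Every figure below is invented.

def module_should_cost(component_costs, assembly_hours, test_hours, labor_rate):
    """Sum purchased-component costs with assembly and test labor cost."""
    labor_cost = (assembly_hours + test_hours) * labor_rate
    return sum(component_costs) + labor_cost

cost = module_should_cost(
    component_costs=[1200.0, 850.0, 300.0],  # e.g. ASIC, FPGA, passives
    assembly_hours=6.0,
    test_hours=2.0,
    labor_rate=95.0,
)
# 2350 in parts plus 760 in labor gives a should cost of 3110
```

In practice each component cost line (such as the ASIC) would itself carry development and recurring pieces, which is exactly where the should-cost analysis earns its keep.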
Original Post Date: Monday, December 30, 2013 Unless you live under a rock, you are aware of the healthcare.gov rollout disaster.  While similar IT failures are regularly in the news, the high profile of healthcare.gov has really mainstreamed awareness of the fragility of many IT projects.  Check out this article entitled ‘The Worst IT project disasters of 2013’.  It details IT project failures such as IBM’s failure to deliver on a payroll system project that could potentially cost taxpayers up to $1.1 billion, and SAP’s failure to deliver satisfactorily on requirements for ...
Original Post Date: Friday, October 4, 2013 In Parametrics is Free, I acknowledged receiving (too late) “you should’ve known to ask that” over the years. Quality control after the fact is fine; but it’s better and cheaper to take a systematic approach to quality assurance as part of your estimating process. The sheer volume of what we model can often keep us so close to the details that we are unable to step back and put on our QA hat for a sanity check. Enter Quality! On a very large project, our team has introduced a few regular cross-checks, notwithstanding typical ...
Original Post Date: Friday, October 4, 2013 My "Real Options Valuation" blog suggested the use of parametrics in real options valuation. I’d like to offer the generalized use of our type of modeling in valuing tangible assets. Typically, fundamental analysis evaluates the intrinsic value of securities. I won’t attempt to compete with Warren Buffett here. But it is certainly the case that the value of a company, or of a portfolio of securities reflecting many companies, is based in part on the market value of its product assets and their potential for future earnings, as well as other objective and subjective considerations. In parametric estimation, ...
Original Post Date: Wednesday, September 25, 2013 The “Systems Folder” cost object, which is found at the start of every TruePlanning Project, is most often confused with the “Folder” icon. The two, however, should not be confused. The “Folder” icon does not have an input sheet at all. It is not a cost object and contains no cost estimating logic or relationships.  It is provided as a collection point so that cost objects can be grouped for clarity – for example, to separate out phases of the acquisition lifecycle or to divide costs between subcontractors.  The “Systems Folder,” by contrast, contains all ...
Original Post Date: Wednesday, September 25, 2013 We may all agree that risk analysis is a necessary, vital part of any valid/defensible cost estimate.  We may not agree as much on the best approach to take to quantify risk in an estimate.  All estimates contain risk.  In the words of a wise cost estimator I know, “That’s why they’re called estimates, and not exactimates!”  We must quantify and manage levels of risk.  Why?  One vital part of a successful program is the ability to build a budget based on reliable cost projections.  Reliability increases when we can analyze inherent risk, ...
Original Post Date: Wednesday, September 25, 2013 A lot of clients have been expressing interest in modeling ASICs, FPGAs, and various other electronic modules inside TruePlanning® (TP). The release of TruePlanning® 2014 will include the capability to model all these products inside our framework. Not only will you be able to model these products, but you will of course be able to model the integration cost of these electronic components with Hardware and Software components. In addition, you will be able to add and estimate the program management of your total project through our integrated framework. TruePlanning Microcircuits ...
Original Post Date: Wednesday, September 25, 2013 “Integration Factors – What Makes TruePlanning™ Stand Out in the Crowd” In today’s world, system integration is becoming more and more important. The government has started asking for designs that have additional capabilities, allowing connectivity both with systems under construction and with systems already in use and deployed. Systems integration is important because it adds value to a system through capabilities that become possible only because of new interactions between subsystems. In a recently posted article on “The True Costs of Integration,” the writer defined the costs of a typical integration ...
Original Post Date: Wednesday, September 25, 2013 These days bidding can be a game, and contractor leadership is constantly deciding whether to take on risk in order to stay competitive or to bid conservatively for the safety of not overrunning.  You may complete a cost model for a program, spend time analyzing the uncertainties behind each input, and in the end find that your estimate lands at the 30% confidence level.  After some strategic analysis, the bid leadership team decides it would like to bid at the 80% confidence level: “please present your estimate to support that total.”  ...
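One common way to produce such confidence levels is Monte Carlo simulation over the uncertain inputs. A minimal sketch, with invented distributions and parameters:

```python
# Monte Carlo sketch: sample uncertain cost inputs many times, sort the
# totals, and read the estimate at a chosen confidence level. The
# triangular-distribution parameters are invented for illustration.
import random

random.seed(7)  # fixed seed so the example is repeatable

def simulate(trials=20_000):
    totals = []
    for _ in range(trials):
        hw = random.triangular(80, 160, 100)  # low, high, most likely
        sw = random.triangular(50, 140, 70)
        totals.append(hw + sw)
    return sorted(totals)

def at_confidence(sorted_totals, confidence):
    """Cost value with the given fraction of trials at or below it."""
    return sorted_totals[int(confidence * (len(sorted_totals) - 1))]

costs = simulate()
p30 = at_confidence(costs, 0.30)
p80 = at_confidence(costs, 0.80)
# bidding at 80% confidence means presenting a higher number than at 30%
```

The gap between the 30% and 80% values is exactly the dollar amount the leadership decision is really about.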
Original Post Date: Wednesday, September 25, 2013 During a recent Analysis of Alternatives (“AoA”) consulting project, our customer asked that we provide more insight into TruePlanning’s System and Assembly objects, which in our AoA context we termed Systems Engineering/Program Management (SE/PM) and Alternative Integration/Test, respectively. The customer’s challenge was understanding our parametric model’s treatment of principally hardware-COTS objects, combined with other cost, purchased service and training objects. Our Chief Scientist, Arlene Minkiewicz, provided us with insights that I’d like to share with you, as well as my views on how we at PRICE Systems have consistently used these parent ...
Original Post Date: Wednesday, September 25, 2013 It is impossible to find a news website or magazine that is not full of articles on the effects of Sequestration.  As a cost estimator, I find the topic very interesting (and troublesome).  The immediate effects of Sequestration are widely discussed.  However, I do not see quite as much news coverage on the second and third order effects of this extremely complex policy decision. The Department of Defense (DoD) has a specific target that must be removed from the budget over the next 10 years.  Some analysts claim a doomsday scenario.  Others claim it ...
Original Post Date: Wednesday, September 25, 2013 In a recent National Public Radio (NPR) interview, Winslow Wheeler (Director of the Straus Military Reform Project of the Project on Government Oversight in Washington, D.C.), spoke on the recent problems with the Joint Strike Fighter acquisition process.  “Wheeler explained that Lockheed Martin, the manufacturer of the jet, uses a pricing vocabulary that masks costs. ‘Flyaway costs, non-recurring and recurring costs, and lots of gobbledygook, and they’ll say that comes to a number like $60-$70 million dollars. And, it’s complete baloney,’ said Wheeler.” (pogo.org)    The F-35 has the distinction of being the most ...
Original Post Date: Wednesday, September 25, 2013 Last November I hosted a webinar that discussed the use of the companion applications live. The session helped to further explain how to use them in conjunction with TP, their history, why we created them, etc. During the presentation I showcased the successes I have had using them, both in the recent AoA I described in part 2 and in this blog, part 3. You can find the recorded webinar on our site. In addition, I described there, as I am going to do here, the differences between the large project engine and the Excel ...
Original Post Date: Wednesday, September 25, 2013 Every day we use tools like TruePlanning to build up detailed parametric cost estimates.  We could spend weeks collecting data and design information, and weeks honing details on risks and uncertainties.  When we finally get to a reasonable point estimate, or even a distribution of probable estimates, there are always more questions.  Of course, the range of queries depends on the purpose of the estimate and who your consumer is.  If you are preparing an estimate for a competitive proposal, a company executive may be your consumer.  They may want to know, “What is the ...
Original Post Date: Wednesday, October 17, 2012 A frequent question from students and consulting clients is how to estimate software size when either: detailed functional requirements descriptions are not yet documented or, even if the latter do exist, the resources necessary (in cost and time) for detailed function point (“FP”) counting are prohibitive. If appropriate analogies or detailed use cases are not available, fast function point counting can be a non-starter, without nominal understanding of pre-design software transactions and data functions.  Hence, the challenge is to find an estimating basis for functional measure (i.e., ...
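One rough fallback is to apply average-complexity weights to coarse counts of transactions and data functions. The counts below are invented, and the weights are the commonly cited IFPUG average-complexity values, so treat this as an approximation at best:

```python
# Rough early sizing from coarse counts, using the commonly cited IFPUG
# average-complexity weights. The counts below are invented; this is an
# approximation for when detailed FP counting is not feasible.

AVG_WEIGHTS = {"EI": 4, "EO": 5, "EQ": 4, "ILF": 10, "EIF": 7}

def approx_ufp(counts):
    """Approximate unadjusted function points from transaction/file counts."""
    return sum(AVG_WEIGHTS[kind] * n for kind, n in counts.items())

ufp = approx_ufp({"EI": 12, "EO": 8, "EQ": 5, "ILF": 6, "EIF": 2})
# 48 + 40 + 20 + 60 + 14 = 182 unadjusted function points
```

The value of a sketch like this is not precision but having a defensible functional measure early, to be refined as requirements firm up.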
Original Post date: Wednesday, October 10, 2012 Deciding whether Excel® is a friend or foe is a hefty topic, so I decided to dedicate several blog posts to the issue!  This first post addresses all of PRICE’s new boundless Companion Applications.  The second will address my experience using the applications with our customers (Do’s and Don’ts); and lastly, the third and final post will wrap it up and explain a more in-depth large project engine that PRICE is currently testing.  As we all know, Microsoft Excel® is a powerhouse tool.  It allows you to house data, format it in many ...
Original Post Date: Thursday, September 27, 2012 I am frequently questioned by clients and prospects about the applicability of PRICE’s parametric software estimation model to agile software development projects.  There are several ways one could respond to this.  My first thought is that if a shop is truly agile, they don’t need an estimation tool.  They know their development team velocity because agile teams are committed to measurement.  They also either know when they need to make a delivery – in which case whatever amount of software they’ve built by that point will be released.  Alternatively, they may know ...
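The velocity point can be made concrete with a back-of-the-envelope forecast; the backlog size and velocity below are invented numbers:

```python
# Back-of-the-envelope release forecast from measured velocity; the backlog
# size and velocity are invented numbers.
import math

def sprints_remaining(backlog_points, velocity):
    """Whole sprints needed to burn down the remaining backlog."""
    return math.ceil(backlog_points / velocity)

n = sprints_remaining(240, 35)
# a 240-point backlog at 35 points per sprint needs 7 sprints
```

A truly agile team gets this answer from its own measurements, which is exactly why the question of an external estimation model is worth unpacking.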
Original Post Date: Tuesday, October 2, 2012 This past year PRICE Systems has entered into a partnership with the International Software Benchmarking Standards Group (ISBSG).  As part of this partnership we have a corporate subscription to both of their databases – the Development and Enhancement Database and the Maintenance and Support Database.  We can use these for analysis and to develop metrics that will help TruePlanning users be better software estimators.  The ISBSG is one of the oldest and most trusted sources for software project data.  They are a not-for-profit organization dedicated to improving software measurement at an international ...