Predictive Analytics for Improved Cost Management

Blog

Field Programmable Gate Arrays (FPGAs) are integrated circuits designed to be configured by a designer after manufacturing.  In recent years, FPGA usage has been increasing at a rapid pace, as their capability (speed, energy efficiency, amount of logic that can fit on the chip, etc.) has come to rival ASICs.  As both the number and size of FPGA projects have increased, improving cost estimation methods for these projects has become more critical to project success. FPGA development combines aspects of both hardware development and software development.  These projects begin with architectural design and writing code in a Hardware Description ...
Check out this article on “The History and Purpose of the Capability Maturity Model (CMM)” (https://toughnickel.com/business/The-History-and-Purpose-of-the-Capability-Maturity-Model-CMM). It provides an interesting and thought-provoking account of how Carnegie Mellon University’s (CMU’s) Software Engineering Institute (SEI) came to be, and how NASA and the US Air Force led the charge to improve software quality.  According to the article, “The Capability Maturity Model was developed to ensure success when success really matters – at NASA and in the military where lives are on the line and success is survival”.  The problem the industry had with this quest ...
We’ve kicked off a study on the cost impacts of various quality assurance standards, and this post gives our preliminary results for modeling DO-178C and DO-254 in TruePlanning®.  DO-178C and DO-254 are standards that deal with the safety of software and electronics used in airborne systems.  DO-178C began as a standard used predominantly by the U.S. Federal Aviation Administration (FAA) for commercial aircraft, and its usage has spread significantly to the U.S. military and many other countries. All of the software and electronics on board an aircraft are categorized into five Design Assurance Levels (DALs) based on how failure of ...
Original Post Date: Friday, June 4, 2010 One of the great features of the TruePlanning cost management software is that it makes it easy to handle the complications of inflation and of estimating projects performed in different countries and currencies. The costs associated with doing work in different countries, and the relative value of different currencies, are constantly changing. To address this, the cost research team at PRICE performs an annual economic update, and this blog will introduce some of the basic concepts and research that go into maintaining this feature every year. The price of goods and ...
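To make the mechanics concrete, here is a minimal sketch of the two adjustments involved: escalating a cost across years, then converting across currencies. The index values and exchange rate are illustrative placeholders, not actual PRICE research data.

```python
# Minimal sketch: normalize a cost quote across years and currencies.
# Index values and exchange rate are invented for illustration only.

INFLATION_INDEX = {2008: 100.0, 2009: 101.5, 2010: 103.2}  # hypothetical, base year 2008

def escalate(cost, from_year, to_year, index=INFLATION_INDEX):
    """Escalate a cost from one year's currency value to another's."""
    return cost * index[to_year] / index[from_year]

def to_usd(cost_local, rate_to_usd):
    """Convert a local-currency cost to USD at a given exchange rate."""
    return cost_local * rate_to_usd

# Example: a 1.2M EUR quote in 2008 euros, expressed in 2010 USD.
cost_2010_eur = escalate(1_200_000, from_year=2008, to_year=2010)
cost_2010_usd = to_usd(cost_2010_eur, rate_to_usd=1.35)  # illustrative rate
print(f"{cost_2010_usd:,.0f} USD (2010)")
```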
Original Post Date: Wednesday, June 30, 2010  I recently had the opportunity to work directly for one of our clients on a high-visibility, must-win proposal. The contractor was just about ready to commit to the bid number, but wanted to know the likely bids of the other two performers. We were asked to do a “Ghosting the Competition” study, in which we ethically collect open-source data on two competing designs and combine it with engineering technical data to develop a best-cost estimate of the competitors’ bid positions.  Unfortunately, little intelligence was available about the competing configurations, but the ...
Original Post Date: Wednesday, September 1, 2010 I had expected to present my webinar, “Best Practices for Cost Effectiveness Studies using TruePlanning”, in early August. As you might know, I was planning to show a real-world example from a recent engagement with a government customer. Unfortunately, since the Source Selection has not concluded with a downselect, I was not able to obtain the public release in time. However, for this month’s blog I will continue to share some of the highlights of the webinar.  In last month’s blog we explored the uses of TruePlanning during Source Selection from the Supplier’s (or ...
Original Post Date: Wednesday, November 10, 2010 I was recently struck by Ash Carter’s (Under Secretary of Defense for Acquisition, Technology & Logistics) Memorandum for Acquisition Professionals, Better Buying Power: Guidance for Obtaining Greater Efficiency and Productivity in Defense Spending (14 September 2010). Within this broad, sweeping memo, Ash Carter outlines 23 principal actions in five major areas aimed at increasing efficiency in Defense acquisition.  The first major area covered is “Target Affordability and Control Cost Growth”. Within this area, program managers must treat affordability as a requirement before milestone authority is granted to proceed (starting with Milestone A). This ...
Original Post Date: Wednesday, June 23, 2010 Parametric modeling is excellent for all aspects of early-concept cost estimation, including go/no-go decisions downstream. So, in the spirit of bringing transparency to (ethical) financial engineering… why not apply our craft to pricing “real options”? The latter are essentially strategic opportunities for engaging resources (cost/schedule) in projects, ventures, investments, or even abandonments. The opportunity to choose has value in itself!  Unlike a static project Net Present Value (often, but not exclusively, approximated with Discounted Cash Flow), which assumes pre-defined decisions, real options reflect the merit of flexibility. If an R&D or proof-of-concept effort provides viability/marketability learning, the option has positive value, above ...
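As a rough illustration of why flexibility carries value above a static NPV, here is a textbook one-period binomial sketch (not PRICE’s method and not drawn from the post itself; all figures are invented):

```python
# One-period binomial sketch of a real option's value (a textbook
# illustration; all figures invented).

def real_option_value(v_up, v_down, investment, p_up, discount_rate):
    """Value of waiting to invest until uncertainty resolves.

    Each branch is floored at zero because we simply decline to invest
    when the payoff is negative -- the source of the flexibility
    premium over a static NPV that commits up front.
    """
    payoff_up = max(v_up - investment, 0.0)
    payoff_down = max(v_down - investment, 0.0)
    expected = p_up * payoff_up + (1 - p_up) * payoff_down
    return expected / (1 + discount_rate)

# Static NPV commits now, absorbing the downside branch:
static_npv = 0.6 * 150 + 0.4 * 40 - 100              # = 6.0
# The option waits for the proof-of-concept result first:
option = real_option_value(150, 40, 100, 0.6, 0.10)  # ~= 27.3
print(round(static_npv, 1), round(option, 1))
```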
Original Post Date: Friday, June 25, 2010  Like titanium and other exotic metal materials, “composites” (by definition, combinations of materials) offer significant weight savings and reduced part counts, but at the price of high production cost. Sound contrarian to our parametric cost estimating view?  Not really. Complexity of manufacture is considerably higher. Likewise, process index and structural tooling values grow. Plus, design lead times lengthen development cycles. That said, understand that composites represent more than a material type. They can involve a highly labor-intensive approach to preparing, braiding/winding, molding, bonding and modular assembly. Yes, some aspects of braiding and molding lend themselves to automation—which then drives tooling ...
Original Post Date: Thursday, October 7, 2010 Ahhhh, the 80s… a challenging (but often confusing) time in an evolving computing world.  Working in 1985 as a software estimator and SQA engineer in a quality assurance department that “audited” real-time projects using new concepts like OOD & OOP… well, you get the picture.  It was a great time to get immersed in great work.  And the good news: that company’s process as well as its developers were bullish on a young estimation/quality type asking plenty of questions… as long as they were of the Yes-No variety.  And ...
Original Post Date: Tuesday, August 24, 2010 Over the past several weeks, a number of users have inquired about the best way to estimate the costs associated with porting existing software to a new hardware environment. Normally in this situation, some of the existing software will require adaptation to operate on the new server, while a large portion of it will only require integration into the new environment.  Estimating the software costs associated with the above will require the use of several cost objects (see the sketch below): - Systems cost object, if program management, quality assurance, configuration, and documentation costs are to be included in ...
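For intuition on how ported code typically rolls up, here is a hedged sketch of an effective-size calculation; the adaptation and integration weights are common illustrative conventions, not TruePlanning’s internal factors.

```python
# Hedged sketch of an "effective size" roll-up for ported software.
# The weighting factors are illustrative conventions only.

def effective_size(new_sloc, adapted_sloc, reused_sloc,
                   adapt_factor=0.4, integrate_factor=0.05):
    """Weight each code category by the fraction of new-code effort it implies."""
    return (new_sloc
            + adapted_sloc * adapt_factor       # code modified for the new server
            + reused_sloc * integrate_factor)   # code needing integration/test only

# 5,000 new + 40% of 20,000 adapted + 5% of 100,000 reused = 18,000 SLOC
print(effective_size(new_sloc=5_000, adapted_sloc=20_000, reused_sloc=100_000))
```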
Original Post Date: Monday, June 7, 2010 Currently we are exploring the best approach to including a more comprehensive estimate of Total Ownership Costs (TOC) in TruePlanning. The current version of the software has focused on development and production costs, with some life cycle costing included. The life cycle costs included are focused on system-specific O&S costs such as initial spares for priming the supply pipeline, maintenance, replenishment spares, etc. It is a system view as opposed to a program view of TOC. As we better understand the need to conduct affordability studies, it has become clear that design decisions ...
Original Post Date: Monday, September 20, 2010 I have been fortunate in my career to have been associated with some great mentors. Each provided me a golden nugget to carry with me as I tried to navigate my way through the professional waters. My first “civilian” manager, after I left the service and joined industry, gave me a list of the Laws of Analysis (I had just started a position as an operations research analyst). He explained that this list was a mix of serious and tongue-in-cheek snippets of wisdom. I looked at ...
Introduction: The goal of this blog is to show how data can flow between TruePlanning and ProPricer. This walkthrough is based on estimating a software application that will provide users the ability to track objects orbiting the Earth using a feed from some fictitious data stream. The benefit is the ability to get the labor requirement (effort in hours) estimated by TruePlanning into ProPricer in a seamless, easily repeatable process.   1. Create ProPricer Proposal for the Orbiting Body Tracking application The first step is to create a proposal in ProPricer with a WBS. Each task in ProPricer will have a set of ...
Original Post Date: Thursday, July 10, 2014 Introduction TruePlanning provides a powerful and highly customizable reporting environment. Project data can be viewed from many different perspectives and those perspectives can be saved and reused across all projects. Results can be exported to Excel and Word as custom reports. There are, however, some instances where there is a need to get beyond the two axes used in TruePlanning’s reporting engine. This need is sometimes expressed by users preparing TruePlanning cost estimating project data for use in a bid or proposal. Perhaps the data needs to be split by phase and labor/non-labor over ...
Original Post Date: Tuesday, July 1, 2014 Whether you’re doing a software cost estimate to support a Bid and Proposal effort, a software valuation, a should-cost analysis, or a detailed project plan, it is vitally important to understand the ‘size’ of the software you are estimating.  The problem with software size is that it tends to fall into the intangible realm.  If you tell me you are building a widget that weighs 13 pounds, I can really start to get my head around the task at hand.  If I’m chatting about this with my European colleagues, ...
Original Post Date: Friday, June 20, 2014 Proposal estimates based on grassroots engineering judgment are necessary to achieve company buy-in, but often are not convincing or not in sync with the price-to-win.  This contention can be resolved by comparing the grassroots estimate to an estimate developed using data-driven parametric techniques.  Parametric estimates apply statistical relationships to project data to determine likely costs for a project.  Of course, for a parametric model to properly support this cross-check of the grassroots estimate, the proper data must be fed into the model.  This most likely requires the estimator to reach ...
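As a minimal illustration of the statistical relationships involved, here is a sketch that fits a classic weight-based cost estimating relationship (CER) of the form cost = a·weight^b by least squares in log space; the historical data points are invented.

```python
# Sketch: fit a parametric CER of the form cost = a * weight^b by
# least squares in log space. Historical points are invented.
import math

history = [(120, 2.1), (250, 3.8), (400, 5.5), (800, 9.7)]  # (weight lb, cost $M)

xs = [math.log(w) for w, _ in history]
ys = [math.log(c) for _, c in history]
n = len(history)
b = (n * sum(x * y for x, y in zip(xs, ys)) - sum(xs) * sum(ys)) / \
    (n * sum(x * x for x in xs) - sum(xs) ** 2)
a = math.exp((sum(ys) - b * sum(xs)) / n)

print(f"cost ~ {a:.3f} * weight^{b:.3f}")
print(f"cross-check estimate at 600 lb: {a * 600 ** b:.1f} $M")
```

A grassroots estimate that lands far from this kind of data-driven curve is a signal to revisit assumptions on one side or the other.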
Original Post Date: Wednesday, April 2, 2014 Introduction Parametric estimates provide reliable, reproducible, and flexible views into cost and effort, so it’s only natural to want to include this data in a bid and proposal workflow. With TruePlanning 2014, big steps have been taken to make such integration seamless and easily reproducible.  New tools in the TruePlanning suite of products, as well as integrations with some of the major bid and proposal software applications, are at the heart of this new feature set. You can learn more about TruePlanning 2014 and the PRICE cost estimation models at our website, but let's ...
Original Post Date: Thursday, March 20, 2014 Here’s a conundrum.  You are a software estimator responsible for helping the decision makers in your company determine what business to pursue and what business to steer clear of.  You know that, to win profitable business, your company first needs to decide which opportunities are golden and which should be avoided.  You also know that, at the point at which this decision needs to be made, there is very little information available to support a quality estimate.  Add to this the fact that software estimation is hard at almost any stage.  What’s ...
Original Post Date: Thursday, March 20, 2014 One of the complications in generating bids and proposals for modules and microcircuits is determining the “should cost” for better cost realism. Most of the electronic modules and their components in these proposals are not actually manufactured by the proposer, but rather by a subcontractor, thus becoming purchased items. This makes it difficult to determine the cost of making a module and to judge what a fair cost would be. Costs for the modules include assembly and test costs together with the component costs. Components such as ASICs (Application Specific Integrated Circuits) have both the cost of developing the devices and ...
Original Post Date: Wednesday, October 30, 2013 Agile development practices have enabled software development organizations to deliver quality software that optimizes the customer’s satisfaction with the value they receive for their money.  That being said, agile development may not be the best approach for every software development project.  Alistair Cockburn, agile development specialist and one of the initiators of the agile software development movement, acknowledges that “agile is not for every project”.  Further elucidating this point, Cockburn opines: “small projects, web projects, exploratory projects, agile is fabulous; it beats the pants off of everything else, but for NASA, no”.  ...
Original Post Date: Friday, October 4, 2013 My "Real Options Valuation" blog suggested the use of parametrics in real options valuation. I’d like to offer the generalized use of our type of modeling in valuing tangible assets. Typically, fundamental analysis evaluates the intrinsic value of securities. I won’t attempt to compete with Warren Buffett here. But it is certainly the case that the value of a company, or of a portfolio of securities reflecting many companies, is based in part on the market value of its product assets and their potential for future earnings, as well as other objective and subjective considerations. In parametric estimation, ...
Original Post Date: Wednesday, September 25, 2013 The “System Folder” cost object, found at the start of every TruePlanning project, is often confused with the “Folder” icon; the two should not be confused. The “Folder” icon does not have an input sheet at all. It is not a cost object and contains no cost estimating logic or relationships.  It is provided as a collection point so that cost objects can be grouped for clarity, for example to separate out phases of the acquisition lifecycle or to divide costs between subcontractors.  The “System Folder”, by contrast, contains all ...
Original Post Date: Wednesday, September 25, 2013 We may all agree that risk analysis is a necessary, vital part of any valid/defensible cost estimate.  We may not agree as much on the best approach to take to quantify risk in an estimate.  All estimates contain risk.  In the words of a wise cost estimator I know, “That’s why they’re called estimates, and not exactimates!”  We must quantify and manage levels of risk.  Why?  One vital part of a successful program is the ability to build a budget based on reliable cost projections.  Reliability increases when we can analyze inherent risk, ...
Original Post Date: Wednesday, September 25, 2013 A lot of clients have been expressing interest in modeling ASICs, FPGAs, and various other electronic modules inside TruePlanning® (TP). The TruePlanning® 2014 release adds the capability to model all of these products inside our framework. Not only will you be able to model these products, but you will of course be able to model the integration cost of these electronic components with hardware and software components. In addition, you will be able to add and estimate the program management of your total project through our integrated framework. TruePlanning Microcircuits ...
Original Post Date: Wednesday, September 25, 2013 “Integration Factors – What Makes TruePlanning™ Stand Out in the Crowd” In today’s world, system integration is becoming more and more important. The government has started asking for designs with additional capabilities that allow connectivity both with systems under construction and with systems already in use and deployed. Systems integration is important because it adds value to a system through abilities that are only possible because of new interactions between subsystems. In a recently posted article on “The True Costs of Integration”, the writer defined the costs of a typical integration ...
Original Post Date: Wednesday, September 25, 2013 Introduction/Problem Statement A current client has expressed interest in the default values on the simple vs. the detailed input sheet. More specifically, the question arose because this particular customer, like several others, had a misconception about the simple vs. detailed input sheet default values. Most users did not realize that, if they were only inputting values on the simple input sheet, the detailed input sheet default values were still being used in the calculation of their cost estimate. So the question became: how much does each of these default value inputs ...
Original Post Date: Wednesday, September 25, 2013 These days bidding can be a game, and contractor leadership is constantly deciding whether to take on risk in order to stay competitive or to bid conservatively for the safety of not overrunning.  You may complete a cost model for a program, spend time analyzing the uncertainties behind each input, and in the end find that your estimate lands at the 30% confidence level.  After some strategic analysis, the bid leadership team decides it would like to bid at the 80% confidence level: “please present your estimate to support that total”.  ...
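To make the 30% vs. 80% idea concrete, here is a minimal Monte Carlo sketch that reads bid values off a simulated cost distribution; the triangular ranges are invented stand-ins for real WBS-element uncertainty analysis.

```python
# Sketch: read bid values off a Monte Carlo cost distribution.
# Triangular ranges below are invented for illustration.
import random

def simulate_totals(n=20_000):
    totals = []
    for _ in range(n):
        total = (random.triangular(8, 15, 10) +   # element 1: low, high, mode ($M)
                 random.triangular(4, 9, 5) +     # element 2
                 random.triangular(2, 6, 3))      # element 3
        totals.append(total)
    return sorted(totals)

totals = simulate_totals()
p30 = totals[int(0.30 * len(totals))]   # 30% chance the outcome is at or below
p80 = totals[int(0.80 * len(totals))]   # 80% chance the outcome is at or below
print(f"30% confidence: {p30:.1f} $M    80% confidence: {p80:.1f} $M")
```

The gap between those two percentiles is exactly the dollar figure leadership is debating when it chooses a confidence level to bid at.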
Original Post Date: Wednesday, September 25, 2013 Background: FPGA design typically uses a library of tiles (sets of gates and transistors) and a CAD system to physically lay out the actual active devices on an empty portion of substrate. These devices are interconnected by physical (usually copper) traces to route the signals and perform the desired tasks. The design may be totally customized for a single set of functions and may not need any form of programming. Other designs may allow some parts of the device to be electronically re-programmed so the device can be calibrated or adjusted for ...
Original Post Date: Wednesday, September 25, 2013 During a recent Analysis of Alternatives (“AoA”) consulting project, our customer asked that we provide more insight into TruePlanning’s System and Assembly objects, which in our AoA context we termed Systems Engineering/Program Management (SE/PM) and Alternative Integration/Test, respectively. The customer’s challenge was understanding our parametric model’s treatment of principally hardware-COTS objects combined with other cost, purchased service, and training objects. Our Chief Scientist, Arlene Minkiewicz, provided us with insights that I’d like to share with you, as well as my views on how we at PRICE Systems have consistently used these parent ...
Original Post Date: Wednesday, September 25, 2013 It is impossible to find a news website or magazine that is not full of articles on the effects of Sequestration.  As a cost estimator, I find the topic very interesting (and troublesome).  The immediate effects of Sequestration are widely discussed.  However, I do not see quite as much news coverage on the second and third order effects of this extremely complex policy decision. The Department of Defense (DoD) has a specific target that must be removed from the budget over the next 10 years.  Some analysts claim a doomsday scenario.  Others claim it ...
Original Post Date: Wednesday, September 25, 2013 In a recent National Public Radio (NPR) interview, Winslow Wheeler (Director of the Straus Military Reform Project of the Project on Government Oversight in Washington, D.C.), spoke on the recent problems with the Joint Strike Fighter acquisition process.  “Wheeler explained that Lockheed Martin, the manufacturer of the jet, uses a pricing vocabulary that masks costs. ‘Flyaway costs, non-recurring and recurring costs, and lots of gobbledygook, and they’ll say that comes to a number like $60-$70 million dollars. And, it’s complete baloney,’ said Wheeler.” (pogo.org)  The F-35 has the distinction of being the most ...
Original Post Date: Wednesday, September 25, 2013 Last November I hosted a live webinar that discussed the use of the companion applications. This session helped to further explain how to use them in conjunction with TP, their history, why we created them, etc. During the presentation I showcased the successes I have encountered using them, both in the recent AoA I described in part 2 and in this blog, part 3. You can find the recorded webinar on our site. In addition, I described there, as I am going to do here, the differences between the large project engine and the Excel ...
Original Post Date: Wednesday, September 25, 2013 We have been blogging a lot lately about finding the right manufacturing complexities for your hardware projects using TrueFindings, and the many ways you can determine the most appropriate manufacturing complexities: taking averages from related projects, finding a correlation to another descriptive parameter (like horsepower for engines), or even using multivariate regression to determine the best value.  If you are a TruePlanning user, you know that manufacturing complexities (for structure or electronics) are the most common parameters to use in calibration, and that manufacturing complexity drives both production and development ...
Original Post Date: Tuesday, September 24, 2013 Risk Analysis Methodology Overview – Method of Moments In this second of three articles on risk analysis, we will discuss the Method of Moments.  The Method of Moments (MoM) is an alternative to Monte Carlo simulation.  Along with the methodology, we will present some pros and cons of using MoM over Monte Carlo.  What is a moment? Before we discuss the methodology behind MoM, we first need to talk about moments.  Caution: for all the master statisticians out there, this article is meant to boil down complex topics in an easy-to-understand manner.  There are obviously ...
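As a preview of the idea, here is a minimal sketch of a Method of Moments roll-up, assuming independent WBS elements and a normal approximation of the total; real implementations handle correlation and skew more carefully.

```python
# Minimal Method of Moments sketch: roll up WBS cost uncertainty
# analytically instead of simulating. Assumes independent elements and
# a normal total; invented numbers for illustration.
import math

elements = [(10.5, 1.2), (6.0, 0.9), (3.5, 0.5)]  # (mean, std dev) per element, $M

total_mean = sum(mean for mean, _ in elements)
total_var = sum(sd ** 2 for _, sd in elements)    # variances add when independent
total_sd = math.sqrt(total_var)

# Normal approximation of the 80th percentile (z ~= 0.8416)
p80 = total_mean + 0.8416 * total_sd
print(f"mean {total_mean:.1f} $M, sd {total_sd:.2f} $M, ~80% level {p80:.1f} $M")
```

The appeal over Monte Carlo is that this runs instantly and is exactly reproducible; the cost is the distributional assumptions baked into the roll-up.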
Original Post Date: Wednesday, October 10, 2012 Deciding whether Excel® is a friend or foe is a hefty topic, so I decided to dedicate several blog posts to the issue!  This first post addresses all of PRICE’s new boundless Companion Applications.  The second will address my experience using the applications with our customers (do’s and don’ts); and lastly, the third and final blog will wrap it up and explain a more in-depth large project engine that PRICE is currently testing.  As we all know, Microsoft Excel® is a powerhouse tool.  It allows you to house data, format it in many ...
Original Post Date: Tuesday, October 2, 2012 This past year PRICE Systems has entered into a partnership with the International Software Benchmarking Standards Group (ISBSG).  As part of this partnership we have a corporate subscription to both of their databases – the Development and Enhancement Database and the Maintenance and Support Database.  We can use these for analysis and to develop metrics that will help TruePlanning users be better software estimators.  The ISBSG is one of the oldest and most trusted sources for software project data.  It is a not-for-profit organization dedicated to improving software measurement at an international ...
Original Post Date: Tuesday, October 2, 2012 We’re building a tool that quickly and easily allows you to map costs in your TruePlanning® projects to any custom Work Breakdown Structure (WBS), including templates for MIL-STD-881C.  By mapping TruePlanning costs to fit your point of view, you can make use of your organization’s existing research and ensure that your cost estimates are complete, realistic, and easily compared to other projects apples-to-apples.  This is great for AoAs and for analyzing where and why costs differ between various solutions (not to mention 881C mapping is a required deliverable for major ...
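Conceptually, the mapping is a lookup from cost objects to WBS elements followed by a roll-up. Here is a minimal sketch under that assumption; the object names, WBS numbers, and costs are invented, and this is not the tool’s actual logic.

```python
# Conceptual sketch: map cost-object results onto a custom WBS and
# roll up. All names and numbers are invented for illustration.
from collections import defaultdict

trueplanning_costs = {           # cost-object name -> cost ($M)
    "Software: Flight Control": 4.2,
    "Hardware: Chassis": 2.8,
    "System: Integration & Test": 1.5,
}

wbs_map = {                      # cost object -> custom WBS element
    "Software: Flight Control": "1.1.2 Flight Software",
    "Hardware: Chassis": "1.1.1 Air Vehicle Structure",
    "System: Integration & Test": "1.2 Systems Integration",
}

rollup = defaultdict(float)
for obj, cost in trueplanning_costs.items():
    rollup[wbs_map[obj]] += cost

for wbs, cost in sorted(rollup.items()):
    print(f"{wbs}: {cost:.1f} $M")
```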
Original Post Date: Thursday, September 27, 2012 I am frequently questioned by clients and prospects about the applicability of PRICE’s parametric software estimation model to agile software development projects.  There are several ways one could respond to this.  My first thought is that if a shop is truly agile, they don’t need an estimation tool.  They know their development team’s velocity because agile teams are committed to measurement.  They also either know when they need to make a delivery – in which case whatever amount of software they’ve built by that point will be released.  Alternatively they may know a ...
Original Post Date: Tuesday, July 3, 2012  Introduction A cost estimation model needs to be fed data, and it is only as good as the data used to create it. Frequently the data needed to ‘feed’ the cost model come from a variety of sources, including engineers, subject matter experts, or other individuals not directly building the cost model. Just asking these experts to stick a finger in the air and take a guess isn’t always the best approach. Using the COM-enabled Data Input Form Excel solution that comes with TruePlanning can help users obtain the data needed to complete ...
Original Post Date: Tuesday, June 5, 2012 Ever wonder which programming languages are the most productive?  I recently did a little research into this topic using the International Software Benchmarking Standards Group (ISBSG) database. The database contains over 5000 data points with size and effort data for projects from a wide variety of industries, applications, and countries.  Of course, not all 5000 data points were suitable for my investigation.  Software size is measured using functional size metrics, but the database accepts projects that use various counting methods.  I narrowed my search to projects that used the International Function Points ...
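The productivity metric behind this kind of analysis is typically project delivery rate: effort hours per function point, where lower is better. Here is a minimal sketch of that computation with invented sample rows, not ISBSG data.

```python
# Sketch of a language-productivity comparison: effort hours per
# function point (lower is better). Sample rows are invented.
from collections import defaultdict

projects = [
    {"language": "Java",  "function_points": 420, "effort_hours": 3_900},
    {"language": "Java",  "function_points": 150, "effort_hours": 1_200},
    {"language": "COBOL", "function_points": 300, "effort_hours": 4_100},
]

totals = defaultdict(lambda: [0, 0])   # language -> [hours, function points]
for p in projects:
    totals[p["language"]][0] += p["effort_hours"]
    totals[p["language"]][1] += p["function_points"]

for lang, (hours, fps) in sorted(totals.items()):
    print(f"{lang}: {hours / fps:.1f} hours per function point")
```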
Original Post Date: Friday, March 23, 2012 Recently I have been playing around with the International Software Benchmarking Standards Group (ISBSG) database for Development and Enhancement projects.  In the interest of full disclosure, I admit that I am more than a little excited to have close to 6000 data points at my fingertips.  I will further admit that there’s something quite daunting about having this much data: where to start, what to look for, and how best to use this data to offer useful guidance for software cost estimation.  For those of you not familiar ...
Original Post Date: Friday, February 24, 2012 When software developers first started writing programs for the Windows® operating system, it wasn’t pretty.  Everything had to be done from scratch – there was no easy access to tools, libraries, and drivers to facilitate development.  A similar tale can be told by the earliest website developers.  A web application framework is an SDK (Software Development Kit) for web developers.  It is intended to support the development of web services, web applications, and dynamic websites.  The framework is intended to increase web development productivity by offering libraries of functionality common ...
Original Post Date: Thursday, February 9, 2012  Model Driven Engineering is a software development methodology focused on creating domain models that abstract the business knowledge and processes of an application domain.  Domain models allow the engineer to pursue a solution to a business problem without considering the eventual platform and implementation technology.  Model Driven Development is a paradigm within Model Driven Engineering that uses models as a primary artifact of the development process, using automation to go from models to actual implementation.  Model Driven Architecture is an approach for developing software within the Model Driven Development paradigm.  It was ...
Original Post Date: Monday, December 12, 2011 Check out this paper, “The Economics of Community Open Source Software Projects: An Empirical Analysis of Maintenance Effort.”  In it, the authors hypothesize that open source practices increase the quality of the software that gets produced and subsequently lead to code that is less costly to maintain.  Low-quality code must be refactored more frequently than high-quality code, and there is substantial evidence that maintenance interventions tend to lead to even more degradation of the quality of the code.  So not only are low-quality applications more expensive to maintain, the unit ...
Original Post Date: Friday, September 9, 2011 I have recently been following an animated thread on LinkedIn, “Death of a Metaphor – Technical Debt.” It has been live for two months with over 200 contributions from dozens of different people.  The discussion was launched by questioning whether continued use of this metaphor makes sense.  The discussion thread weaves and bobs around actually answering this question, but it’s amazing how passionate the world is on this topic.  My personal opinion is that it’s a perfectly adequate metaphor because it helps create a discussion between IT and the business leaders in terms ...
Original Post Date: Friday, September 2, 2011  The IEEE published “Top 11 Technologies of the Decade” in the January 2011 edition of IEEE Spectrum magazine.  It should come as a surprise to no one that the Smartphone was number 1 on this list.  The answer to the author’s question “Is your phone smarter than a fifth grader?” was a resounding YES![1]  In 1983 Motorola introduced the first handheld cellular phone.  It weighed in at two and a half pounds, had memory capacity for 30 phone numbers, took 10 hours to recharge, and had a selling price of $4000 ...