Predictive Analytics for Improved Cost Management

Blog

Original Post Date: Monday, December 30, 2013 Unless you live under a rock, you are aware of the healthcare.gov rollout disaster. While similar IT failures are regularly in the news, the high profile of healthcare.gov has mainstreamed awareness of the fragility of many IT projects. Check out this article entitled "The Worst IT Project Disasters of 2013". It details IT project failures such as IBM's failure to deliver on a payroll system project that could potentially cost taxpayers up to $1.1 billion (US), and SAP's failure to deliver satisfactorily on requirements for ...
Original Post Date: Tuesday, December 3, 2013 Forrester defines big data as "the techniques and technologies that make capturing value from data at extreme scales economical". Wikipedia defines it as "a collection of data sets so large and complex that it becomes difficult to process using on-hand database management tools or traditional data processing applications. The challenges include capture, curation, storage, search, sharing, analysis and visualization". Many use the 3Vs to describe the characteristics of big data – Volume, Variety and Velocity. Basically, Big Data refers to number crunching of epic proportions, accomplishing in minutes what may have ...
Original Post Date: Wednesday, October 30, 2013 Agile development practices have enabled software development organizations to deliver quality software that optimizes the customer's satisfaction with the value they receive for their money. That being said, agile development may not be the best approach for every software development project. Alistair Cockburn, agile development specialist and one of the initiators of the agile software development movement, acknowledges that "agile is not for every project". Further elucidating this point, Cockburn opines: "small projects, web projects, exploratory projects, agile is fabulous; it beats the pants off of everything else, but for NASA, no". ...
Original Post Date: Friday, October 4, 2013 In my "Work Breakdown Structures are Workable!" blog, we reviewed the subtleties of using the [object] tag to your advantage in creating different sorts and roll-up subtotals. As a follow-up, I'd like to drill down a bit on the initial step of using the "copy grid" exports. Each row number is unique, thus creating an identifying key for the VLOOKUP function in Excel. Since all object-by-activity instances are allocated 100% to one of the three phases (with very rare exception), these row keys allow you to sort and re-group ...
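As a rough illustration of the row-key idea in the excerpt above, here is a hedged sketch in Python/pandas rather than Excel; the column names and costs are hypothetical, not PRICE's own tooling or data.

```python
import pandas as pd

# Hypothetical copy-grid export: each row carries a unique row number (the key).
grid = pd.DataFrame({
    "row_key":  [1, 2, 3, 4],
    "object":   ["Radar", "Radar", "Antenna", "Antenna"],
    "activity": ["Design", "Build", "Design", "Build"],
    "cost":     [120.0, 340.0, 80.0, 210.0],
})

# A re-sorted/re-grouped sheet that kept only the keys can still look the
# detail back up by key -- the same role VLOOKUP plays in Excel.
phase_map = pd.DataFrame({
    "row_key": [1, 3, 2, 4],
    "phase":   ["Development", "Development", "Production", "Production"],
})
merged = phase_map.merge(grid, on="row_key", how="left")
print(merged.groupby("phase")["cost"].sum())   # roll-up subtotals by phase
```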
Original Post Date: Friday, October 4, 2013 TruePlanning results have many options, including viewing Costs by Activity. While simple, this view can be quite powerful, especially when exported for reorganization and manipulation. In a recent exercise, the WBS mapping of common objects, estimated in multiple separate scenarios, presented a non-trivial chore in Excel. "Transposition" features work fine for matrices, as do pivot tables. But how does one map object-by-activity grids into activity lists, similar to MIL-STD 881A, with singular "roll up" instances of all nonzero object costs? The secret is in how TruePlanning appends each activity output with the ...
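A minimal sketch of the grid-to-list mapping described above, expressed in Python/pandas with made-up objects and costs (the post itself works the problem in Excel):

```python
import pandas as pd

# Hypothetical object-by-activity grid, as exported from the cost model
grid = pd.DataFrame(
    {"Design": [100.0, 0.0], "Build": [250.0, 90.0], "Test": [0.0, 40.0]},
    index=["Radar", "Antenna"],
)

# Unpivot the matrix into an activity list, drop zero cells, and roll up so
# each activity appears once with the sum of all nonzero object costs.
long = grid.stack().rename("cost").reset_index()
long.columns = ["object", "activity", "cost"]
rollup = long[long["cost"] > 0].groupby("activity")["cost"].sum()
print(rollup)
```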
Original Post Date: Friday, October 4, 2013 The late Philip Crosby's "Quality is Free" taught us that an investment in quality is more than offset by the prevention of defects based upon an understanding of requirements. Only with the latter can lack of conformance (and its subsequent costs) be captured and hence quantified toward quality. So how then is parametrics relevant? Parametric estimating is more than cost modeling. Our craft represents an initial consulting function into the accuracy and completeness of program planning concepts. Our customers trust us to know when to ask and when to supplement. Yes, we are mathematical and financial modelers ...
Original Post Date: Friday, October 4, 2013 In his August 2010 blog entry, Zac Jasnoff outlined typical client perspectives for the different types of analyses that TruePlanning can accommodate. Working on a large project, we've experienced situations that realistically can happen, where the initial intent... and model structuring… later has the boundaries of model appropriateness stretched. An AoA, for example, is meant to measure deltas between a baseline and its alternatives. If common costs "wash", then they can be excluded… which becomes an issue when the result is treated as a Rough Order of Magnitude for customer budgeting. Likewise, if a ROM or ICE of ...
Original Post Date: Friday, October 4, 2013 I'm not a golfer. But we've all heard a golfer say "that's why I play" after hitting a shot and feeling like it all came together. What "it" is, in terms of mechanics and timing, I'm not really sure. In our own world of parametrics, it's the feeling of adding value in that golden moment of facilitating decisions and forward momentum. We wear many hats: estimating, consulting, systems engineering... even cost accounting. Building an AoA, ICE or ROM is where the rubber meets the road with regard to configurations and assumptions. Not too long ago I was in a discussion with ...
Original Post Date: Friday, October 4, 2013 In Parametrics is Free, I acknowledged receiving (too late) "you should've known to ask that" over the years. Quality control after the fact is fine; but it's better and cheaper to take a systematic approach to quality assurance as part of your estimating process. The sheer volume of what we model can often keep us so close to the details that we are unable to step back and put our QA hat on for a sanity check. Enter Quality! On a very large project, our team has introduced a few regular cross-checks, notwithstanding typical ...
Original Post Date: Friday, October 4, 2013 Like titanium and other exotic metal materials, "composites" (by definition, combinations of materials) offer significant weight savings and reduced part counts, but at the price of high production cost. Sound contrarian to our parametric cost estimating view? Not really. Complexity of manufacture is considerably higher. Likewise, process index and structural tooling values grow. Plus, design lead times drive developmental cycles. That said, understand that composites represent more than a material type. They can involve a highly labor-intensive approach to preparing, braiding/winding, molding, bonding and modular assemblage. Yes, some aspects of braiding and molding lend themselves ...
Original Post Date: Friday, October 4, 2013 Parametric modeling is excellent for all aspects of early-concept cost estimation, including go/no-go decisions downstream. So, in the spirit of bringing transparency to (ethical) financial engineering… why not apply our craft to pricing "real options"? The latter are essentially strategic opportunities for engaging resources (cost/schedule) into projects, ventures, investments, or even abandonments. The opportunity to choose has value itself! Unlike static project Net Present Value (often, but not exclusively, approximated with Discounted Cash Flow), which assumes pre-defined decisions, real options reflect the value of flexibility. If an R&D or proof-of-concept effort presents viability/marketability learning, the option has positive ...
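To make the contrast concrete, here is a deliberately simplified numerical sketch, with hypothetical cash flows and probabilities (not a PRICE model), of static NPV versus the value of a staged real option:

```python
def npv(cashflows, rate):
    """Discount a list of yearly cash flows (year 0 first)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

rate = 0.10

# Static DCF: commit everything up front to an uncertain expected payoff.
static_npv = npv([-50, -100, 150], rate)

# Real option: spend 50 on a proof of concept, then invest the remaining 100
# only in the 60%-likely "good news" state (payoff 300); otherwise abandon.
p_good = 0.6
continue_value = npv([0, -100, 300], rate)
option_value = -50 + p_good * max(continue_value, 0.0)

print(f"Static NPV:  {static_npv:6.1f}")    # negative: reject under DCF alone
print(f"With option: {option_value:6.1f}")  # positive: the flexibility has value
```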
Original Post Date: Friday, October 4, 2013 Ahhhh, the 80s… a challenging (but often confusing) time in an evolving computing world. Working in 1985 as a software estimator as well as an SQA engineer in a quality assurance department that "audited" real-time projects using new concepts like OOD and OOP… well, you get the picture. It was a great time to get immersed in great work. And the good news: that company's process, as well as its developers, were bullish on a young estimation/quality type asking plenty of questions… as long as they were of the Yes-No variety. And ask ...
Original Post Date: Friday, October 4, 2013 ...wear the worst shoes. The cobbler was a master at his craft; he was just too tired to practice it when he got home from the shop. Sound familiar? A disciplined approach to understanding (functional) requirements as well as analogous projects (with actuals) is our not-so-secret sauce. Why run the risk of creeping back up our career learning curve? There’s already enough scope creep to keep us busy. Plus, for you management types charged with prospecting, a consistent approach towards estimation is a great way to connect with people who've felt the pain of ...
Original Post Date: Friday, October 4, 2013 My "Real Options Valuation" blog suggested the use of parametrics in real options valuation. I'd like to offer the generalized use of our type of modeling in valuing tangible assets. Typically, fundamental analysis evaluates the intrinsic value of securities. I won't attempt to compete with Warren Buffett here. But it is certainly the case that the value of a company, or of a portfolio of securities reflecting many companies, is based in part on the market value of its product assets and their potential for future earnings, as well as other objective and subjective considerations. In parametric estimation, ...
Original Post Date: Wednesday, September 25, 2013 The "Systems Folder" cost object, which is found at the start of every TruePlanning project, is most often confused with the "Folder" icon; the two, however, should not be confused. The "Folder" icon does not have an input sheet at all. It is not a cost object and contains no cost estimating logic or relationships. It is provided as a collection point so that cost objects can be grouped for clarity, for example to separate out phases of the acquisition lifecycle or to divide costs between subcontractors. Whereas, the "System Folder" contains all ...
Original Post Date: Wednesday, September 25, 2013 We may all agree that risk analysis is a necessary, vital part of any valid/defensible cost estimate.  We may not agree as much on the best approach to take to quantify risk in an estimate.  All estimates contain risk.  In the words of a wise cost estimator I know, “That’s why they’re called estimates, and not exactimates!”  We must quantify and manage levels of risk.  Why?  One vital part of a successful program is the ability to build a budget based on reliable cost projections.  Reliability increases when we can analyze inherent risk, ...
Original Post Date: Wednesday, September 25, 2013 A lot of clients have been expressing interest in modeling ASICs, FPGAs, and various other electronic modules inside TruePlanning® (TP). The TruePlanning® 2014 release will add the capability to model all of these products inside our framework. Not only will you be able to model these products, but you will of course be able to model the integration cost of these electronic components with hardware and software components. In addition, you will be able to add and estimate the program management of your total project through our integrated framework. TruePlanning Microcircuits ...
Original Post Date: Wednesday, September 25, 2013 In government contracting, all contracts are made up of a network of suppliers. The prime contractor who won the overall bid usually has a supply chain of vendors from whom it receives products and services. In addition, it has subcontractors who provide services under a contracted agreement of work. These vendors and subcontractors most likely have their own networks of suppliers, which allows for a cost-effective supply chain that extends across America and to other nations. Vendors sell identical or similar products to different customers as part of their regular operations. These ...
Original Post Date: Wednesday, September 25, 2013 "Integration Factors – What Makes TruePlanning™ Stand Out in the Crowd" In today's world, system integration is becoming more and more important. The government has started asking for designs that have additional capabilities, which allow connectivity both with systems under construction and systems already in use and deployed. Systems integration is important because it adds value to the system by adding abilities that are now possible because of new interactions between subsystems. In a recently posted article on "The True Costs of Integration," the writer defined the costs of a typical integration ...
Original Post Date: Wednesday, September 25, 2013 Introduction/Problem Statement: A current client has expressed interest in the default values on the simple vs. the detailed input sheet. More specifically, the question arose because this particular customer, as well as others, had a misconception about the simple vs. the detailed input sheet default values. Most users did not realize that if they were only entering values on the simple input sheet, the detailed input sheet default values were still being used in the calculation of their cost estimate. So the question became: how much are each of these default value inputs ...
Original Post Date: Wednesday, September 25, 2013 These days bidding can be a game, and contractor leadership is constantly making decisions on whether to take on risk in order to stay competitive or to bid conservatively for the safety of not overrunning. You may complete a cost model for a program, spend time analyzing the uncertainties behind each input, and in the end find that your estimate lands at the 30% confidence level. After some strategic analysis, the bid leadership team decides it would like to bid at the 80% confidence level: "please present your estimate to support that total." ...
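The mechanics of "presenting the estimate at the 80% confidence level" amount to reading a higher percentile off the cost-risk distribution. A hedged sketch with placeholder triangular distributions (not calibrated data):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
hardware = rng.triangular(8.0, 10.0, 15.0, n)   # (low, mode, high) in $M
software = rng.triangular(4.0, 5.0, 9.0, n)
total = hardware + software

for level in (30, 50, 80):
    print(f"{level}% confidence: ${np.percentile(total, level):.1f}M")
# Bidding at the 80% level means presenting the higher percentile value,
# along with the adjustments and rationale that support it.
```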
Original Post Date: Wednesday, September 25, 2013 Background: FPGA design typically uses a library of Tiles (Sets of gates and transistors) and a CAD system to physically lay out the actual active devices on an empty portion of substrate. These devices are interconnected by physical (usually copper) traces to route the signals and perform the desired tasks. The design may be totally customized for a single set of functions and may not need any form of programming. Other designs may allow some parts of the device to be electronically re-programmed to allow the device to be calibrated or adjusted for ...
Original Post Date: Wednesday, September 25, 2013 A current consulting client has expressed interest in modeling scheduled overhauls, above and beyond scheduled maintenance. The latter is well-addressed by TruePlanning. Our challenge is to utilize the model and its calculated lifecycle metrics to estimate the former as well. We have recently developed an approach to address this specific need that I’d like to share with you here. Terminology is important to level-set.  Notwithstanding replacement procurement, restoring equipment readiness via operations & maintenance activities usually falls into one of three funded categories[1]:  Inspection/ Repair at the organizational and/or direct support ...
Original Post Date: Wednesday, September 25, 2013 As promised in my last blog (“System and Assembly Objects in the context of Hardware AoAs”), the integration of Software COTS is a subtly different challenge.  Customers are typically presented with two scenarios:  integration within a single system and integration within a system of systems (“SoS”).  Both cases are handled well by TruePlanning™ with specific parameter choices that control multiple activities relevant to software integration.  As I’ve come to appreciate PRICE’s competitive advantage with our Framework’s approach to the latter SoS case, I thought describing these two scenarios would help allow you to ...
Original Post Date: Wednesday, September 25, 2013 During a recent Analysis of Alternatives ("AoA") consulting project, our customer asked that we provide more insight into TruePlanning's System and Assembly objects, which in our AoA context we termed Systems Engineering/Program Management (SE/PM) and Alternative Integration/Test, respectively. The customer's challenge was understanding our parametric model's treatment of principally hardware-COTS objects, combined with other cost, purchased service and training objects. Our Chief Scientist, Arlene Minkiewicz, provided us with insights that I'd like to share with you, as well as my views on how we at PRICE Systems have consistently used these parent ...
Original Post Date: Wednesday, September 25, 2013 It is impossible to find a news website or magazine that is not full of articles on the effects of Sequestration.  As a cost estimator, I find the topic very interesting (and troublesome).  The immediate effects of Sequestration are widely discussed.  However, I do not see quite as much news coverage on the second and third order effects of this extremely complex policy decision. The Department of Defense (DoD) has a specific target that must be removed from the budget over the next 10 years.  Some analysts claim a doomsday scenario.  Others claim it ...
Original Post Date: Wednesday, September 25, 2013 In a recent National Public Radio (NPR) interview, Winslow Wheeler (Director of the Straus Military Reform Project of the Project on Government Oversight in Washington, D.C.), spoke on the recent problems with the Joint Strike Fighter acquisition process.  “Wheeler explained that Lockheed Martin, the manufacturer of the jet, uses a pricing vocabulary that masks costs. ‘Flyaway costs, non-recurring and recurring costs, and lots of gobbledygook, and they’ll say that comes to a number like $60-$70 million dollars. And, it’s complete baloney,’ said Wheeler.” (pogo.org)    The F-35 has the distinction of being the most ...
Original Post Date: Wednesday, September 25, 2013 “On 29 July 2003, the Acting Under Secretary of Defense (Acquisition, Technology and Logistics) signed a policy memorandum entitled “Policy for Unique Identification (UID) of Tangible Items – New Equipment, Major Modifications, and Reprocurements of Equipment and Spares”. This Policy made UID a mandatory DoD requirement on all new equipment and materiel delivered pursuant to solicitations issued on or after January 1, 2004. USD(AT&L) issued verbal guidance that tangible assets manufactured by DoD’s organic depots were to be considered “new” items which fall under UID marking policy, beginning 1 January, 2005. An item is considered “significant”, and will be uniquely ...
Original Post Date: Wednesday, September 25, 2013 What follows is PRICE's interpretation of how to model FPGAs and ASICs inside our current version of TruePlanning® 2012 SR2 and older versions. An ASIC is an application-specific integrated circuit, customized for a particular use. FPGAs are field-programmable gate arrays, considered the modern-day technology that can be used in many different applications because of their programmable logic blocks. In industry, ASICs have grown from 5,000 gates to over 100 million. Designers of digital ASICs use VHDL to describe the functionality. Estimating Best Practice for TruePlanning: Model your electronics at the board level ...
Original Post Date: Wednesday, September 25, 2013 Last November I hosted a live webinar that discussed the use of the companion applications. The session helped to further explain how to use them in conjunction with TP, the history, and why we created them. During the presentation I showcased the successes I have encountered using them, both in the recent AoA I described in part 2 and in this blog, part 3. You can find the recorded webinar on our site. In addition, I described, as I am going to do here, the differences between the large project engine and the Excel ...
Original Post Date: Wednesday, September 25, 2013 Every day we use tools like TruePlanning to build up detailed parametric cost estimates. We could spend weeks collecting data and design information, and weeks honing details on risks and uncertainties. When we finally get to a reasonable point estimate, or even a distribution of probable estimates, there are always more questions. Of course, the range of queries depends on the purpose of the estimate, and who your consumer is. If you are preparing an estimate for a competitive proposal, a company executive may be your consumer. They may want to know, "What is the ...
Original Post Date: Wednesday, September 25, 2013 After spending time building up a point estimate, a cost estimator always has to do some additional work to break the estimate down into terms the estimate's consumer will understand, or to perform different comparisons for various analyses. Sometimes you need to map the estimate into a standard Work Breakdown Structure or Cost Element Structure. Sometimes you want to compare to a bottom-up or grass-roots estimate. Or, if you are planning budgets or manpower out into the future, you need details. We have to speak a lot of languages in order to ...
Original Post Date: Wednesday, September 25, 2013 I’ve recently had a number of users ask, “How do I model life cycle costs for a missile that just sits on a shelf?”  I had never actually tried to model this, but of course I know it’s possible.  So I turned to some of my fellow PRICE experts, and found that of course this is not the first time anyone has ever tried to model this kind of thing… Many ordnance weapons such as mortar shells, torpedoes, bombs, missiles and various projectiles are stockpiled until they are actually needed. These weapons ...
Original Post Date: Wednesday, September 25, 2013 We have been blogging a lot lately about finding the right manufacturing complexities for your hardware projects using TrueFindings, and the many ways you can determine the most appropriate manufacturing complexities: taking averages from related projects, finding a correlation to another descriptive parameter (like horsepower for engines), or even looking into multivariate regression to determine the best value. If you are a TruePlanning user, you know that manufacturing complexities (for structure or electronics) are the most common parameters to use in calibration, and that the manufacturing complexity drives both production and development ...
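As a hedged illustration of the "correlation to another descriptive parameter" idea, a simple log-linear fit might look like the following; the horsepower and complexity values are hypothetical, not PRICE data.

```python
import numpy as np

horsepower = np.array([150.0, 300.0, 450.0, 600.0])
complexity = np.array([5.1, 5.6, 6.0, 6.3])      # hypothetical calibrated values

# Fit complexity against log(horsepower), then predict a new item.
slope, intercept = np.polyfit(np.log(horsepower), complexity, 1)
new_hp = 500.0
predicted = slope * np.log(new_hp) + intercept
print(f"Predicted manufacturing complexity for {new_hp:.0f} hp: {predicted:.2f}")
```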
Original Post Date: Wednesday, September 25, 2013 We all pull data and research from various sources when creating a project estimate. You may pull together public CERs, internal research, subscription-based data or commercial models. In the end you want your entire estimate in one format. If you use TruePlanning, you may have used the "Other Direct Cost Object" in the past to include costs estimated in another model. You may have utilized the "Equation Cost Object" to include a CER with up to 5 variables, which would allow you to account for the size and complexity of an ...
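For readers unfamiliar with the shape of a CER, here is a hypothetical two-variable example (coefficients invented purely for illustration; the Equation Cost Object itself supports up to five variables):

```python
def cer_cost(weight_lb: float, quantity: int,
             a: float = 2.5, b: float = 0.8, c: float = 0.95) -> float:
    """Hypothetical CER of the form cost = a * weight^b * quantity^c (in $K)."""
    return a * weight_lb ** b * quantity ** c

print(f"${cer_cost(1200.0, 25):,.0f}K")
```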
Original Post Date: Tuesday, September 24, 2013 Risk Analysis Methodology Overview – Method of Moments In this second of three articles on risk analysis, we will discuss the Method of Moments. Method of Moments (MoM) is an alternative to Monte Carlo simulation. Along with the methodology, we will present some pros and cons of using MoM over Monte Carlo. What is a moment? Before we discuss the methodology behind MoM, we first need to talk about moments. Caution: for all the master statisticians out there, this article is meant to boil down complex topics in an easy-to-understand manner. There are obviously ...
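In the same boiled-down spirit as the article, a sketch of the core MoM idea: instead of simulating, combine the first two moments of each WBS element analytically (assuming independence, means add and variances add), then approximate confidence levels from the combined distribution. The numbers below are placeholders.

```python
from math import sqrt
from statistics import NormalDist

elements = [          # (mean, standard deviation) per WBS element, in $M
    (10.0, 2.0),
    (5.0, 1.0),
    (8.0, 1.5),
]

total_mean = sum(m for m, _ in elements)
total_sd = sqrt(sum(s ** 2 for _, s in elements))   # variances add if independent

dist = NormalDist(total_mean, total_sd)              # normal approximation of the total
print(f"Total mean = {total_mean}, sd = {total_sd:.2f}")
print(f"80% confidence level ~ {dist.inv_cdf(0.80):.1f}")
```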
Original Post Date: Thursday, February 21, 2013 I recently attended a webinar presented by David Herron of the David Consulting Group (DCG) discussing a recently released specification for the automation of function point counting (available on the Consortium for IT Software Quality (CISQ) site). Function point counting is a process through which software 'size' is measured by the amount of business value that the software delivers to the end user. Function point counts are thought by many to be a far superior means of measuring software 'size' because they are technology neutral and not impacted by factors such as ...
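For orientation, here is a rough sketch of an unadjusted IFPUG-style function point count using the standard average-complexity weights; a real count classifies each function as low/average/high and applies a value adjustment factor, and the counts below are hypothetical.

```python
AVERAGE_WEIGHTS = {
    "external_inputs": 4,
    "external_outputs": 5,
    "external_inquiries": 4,
    "internal_logical_files": 10,
    "external_interface_files": 7,
}

counts = {  # hypothetical counts for a small application
    "external_inputs": 12,
    "external_outputs": 8,
    "external_inquiries": 5,
    "internal_logical_files": 6,
    "external_interface_files": 2,
}

unadjusted_fp = sum(AVERAGE_WEIGHTS[k] * n for k, n in counts.items())
print(f"Unadjusted function points: {unadjusted_fp}")
```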
Original Post Date: Wednesday, January 9, 2013 While TruePlanning by PRICE Systems is the world's leading parametric cost estimation tool, the reality is that not all members of an organization are going to have TruePlanning installed on their desktops, but that doesn't mean those people can't get a view into the power of TruePlanning. The TruePlanning Viewer is a tool under development that allows users to view the PBS and data contained in a TruePlanning project without having TruePlanning on their desktops. (Figure 1: TruePlanning Viewer) TruePlanning users will be able to create special "tpview" files that non-TruePlanning users can use with ...
Original Post Date: Tuesday, January 8, 2013 The preliminary results are in for our electronics complexity study! For those who are unfamiliar with TruePlanning® lingo, Manufacturing Complexity for Electronics is our scale for measuring the differences in technology, producibility (component make-up, packaging density, test and reliability requirements, etc.) and yield of the electronics being estimated. As the complexity number gets bigger, it means the electronics are more complex and more costly (per unit weight) to develop and manufacture. At first glance, the general trend from lowest complexity to highest complexity seems about right. The lower-complexity items are found on ...
Original Post Date: Tuesday, December 18, 2012 One of the biggest challenges estimators face is defending their estimates. You may trust your estimate, but how do you get others on board who might be unfamiliar with parametric estimating? Showing comparisons of your project to similar completed projects is one of the best methods of defending your choice of inputs and your final results. It's also a method that nearly everyone understands. Unfortunately, relevant, high-quality data to compare with isn't always available. There are two important trends related to this problem. First, high-quality data is being protected more so than ...
Original Post Date: Thursday, November 29, 2012 With Customize View, TruePlanning™ offers great flexibility in editing favorites as row-by-column grids. Typical examples are Object by Activity, Object by Resource and Activity by Resource. But what if you wanted to view any of these grids also by year? In other words, how can we add a third dimension? Last October, in my "Work Breakdown Structures are Workable!" blog, we discussed the use of Excel's Data-Sort and Data-Group options to build up a WBS, for a MIL-STD 881-style view of Activity by Object as a linear list. Results were singular ...
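The "third dimension" idea can also be sketched outside Excel; here is a hedged pandas example (made-up objects, activities and costs) that breaks an Object-by-Activity grid out by year via a pivot table:

```python
import pandas as pd

rows = pd.DataFrame({
    "object":   ["Radar", "Radar", "Radar", "Antenna"],
    "activity": ["Design", "Design", "Build", "Build"],
    "year":     [2013, 2014, 2014, 2014],
    "cost":     [60.0, 40.0, 250.0, 90.0],
})

cube = rows.pivot_table(index=["object", "activity"], columns="year",
                        values="cost", aggfunc="sum", fill_value=0.0)
print(cube)   # Object-by-Activity rows, with years as the third dimension
```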
Original Post Date: Thursday, November 1, 2012 Changing organization culture is often cited as necessary for surviving the grim financial realities we face today. Everywhere you look, no one seems to have enough money to buy what they need, but somehow, the need must be fulfilled. If we can just change organization culture, a smart new idea will emerge to save the day; maybe so. How, then, do we change the culture? Practically every B-school and business periodical has written on the subject. One thing they all agree on is that achieving organization change is one of, if not the ...
Original Post Date: Wednesday, October 17, 2012 PRICE was recently tasked by a client to provide an Analysis of Alternatives (AOA). As we assembled a team to complete this, I was tasked with modeling all the alternatives inside TruePlanning.  After the initial data call, we realized that this project would be cumbersome due to the large amount of data. While developing the project plan, I had to think of a crafty way to get data in and out of TruePlanning efficiently. It was interesting to note how much capability I could utilize from the TruePlanning Companion Applications to effectively support ...
Original Post Date: Wednesday, October 17, 2012 A frequent question from students and consulting clients is how to estimate software size when either detailed functional requirements descriptions are not yet documented or, even if the latter do exist, the resources necessary (in cost and time) for detailed function point ("FP") counting are prohibitive. If appropriate analogies or detailed use cases are not available, fast function point counting can be a non-starter without nominal understanding of pre-design software transactions and data functions. Hence, the challenge is to find an estimating basis for functional measure (i.e., ...
Original Post Date: Thursday, October 11, 2012 Over the past year and a half of customer mentoring, I have been responding to more and more requests regarding how to represent the DoD acquisition phases in the development of TruePlanning® cost estimates. With the renewed interest in Total Ownership Costs, there appears to be a desire to have greater visibility into costs by appropriation over a well-defined/understood schedule. This need to estimate and report out on cost by appropriation and schedule has been a driver behind the need to represent the acquisition phases more explicitly within TruePlanning® than is ...
Original Post Date: Wednesday, October 10, 2012 Deciding whether Excel® is a friend or foe is a hefty topic, so I decided to dedicate several blog posts to the issue! This first post addresses all of PRICE's new boundless Companion Applications. The second will address my experience using the applications with our customers (Do's and Don'ts); and lastly, the third and final blog will wrap it up and explain a more in-depth large project engine that PRICE is currently testing. As we all know, Microsoft Excel® is a powerhouse tool. It allows you to house data, format it in many ...
Original Post Date: Wednesday, October 10, 2012 Recently I wrote a blog on the "Role of Value Engineering in Affordability Analysis." In that blog, I wrote about the importance of understanding the cost behavior of each candidate (alternative) architecture as part of the Value Engineering method in order to achieve affordability. I defined "cost behavior" as the relationship of how cost varies as design factors such as new materials, cutting-edge technology, new manufacturing processes, and extensive support requirements associated with a particular function cause cost to change. What are the drivers, and how does cost change as those ...
Original Post Date: Monday, October 8, 2012 The answer: ~10.226. At least that's the value on our complexity scale we found via calibration after modeling it in TruePlanning®. Check out this teardown of the new iPhone 5, which breaks it down into a bill of materials, each with an estimated cost. A colleague had the cool idea to model the iPhone 5 with TruePlanning, using information we could find on the internet. This was a really thought-provoking exercise to help me as I'm updating our electronics complexity guidance, because the electronics in the iPhone 5 are state-of-the-art. Around ...
Original Post Date: Friday, October 5, 2012 I am currently involved in co-authoring a white paper on the “Role of Value Engineering in Affordability Analysis.” For the purposes of this discussion I define affordability as that characteristic of a product or service that responds to the buyer’s price, performance, and availability needs simultaneously. The writing of this white paper has been an interesting exercise for me because my fellow co-authors come from different backgrounds and thus have very different points of view. The majority of my colleagues are “card carrying” Systems Engineers. As such, they have a perspective that ...
Original Post Date: Thursday, September 27, 2012 I am frequently questioned by clients and prospects about the applicability of PRICE's parametric software estimation model to agile software development projects. There are several ways one could respond to this. My first thought is that if a shop is truly agile, they don't need an estimation tool. They know their development team velocity because agile teams are committed to measurement. They also either know when they need to make a delivery – in which case whatever amount of software they've built by that point will be released. Alternatively, they may know ...