• Predictive Analytics for Improved Cost Management



Blog



Original Post Date: Tuesday, December 18, 2012 One of the biggest challenges estimators face is defending their estimates.  You may trust in your estimate, but how do you get others on board who might be unfamiliar with parametric estimating?  Showing comparisons of your project to similar completed projects is one of the best methods of defending your choice of inputs and your final results.  It’s also a method that nearly everyone understands.  Unfortunately, relevant, high-quality data to compare with isn’t always available. There are two important trends related to this problem.  First, high-quality data is being protected more so than ...
Original Post Date: Thursday, November 29, 2012 With Customize View, TruePlanning™ offers great flexibility in editing favorites as row by column grids.  Typical examples are Object by Activity, Object by Resource and Activity by Resource.  But what if you wanted to view any of these grids also by year?  In other words, how can we add a third dimension?  Last October, in my “Work Breakdown Structures are Workable!” blog, we discussed the use of Excel’s Data-Sort and Data-Group options to build up a WBS, for a MIL-STD 881 style view of Activity by Object as a linear list.  Results were singular ...
Original Post Date: Thursday, November 1, 2012  Changing organization culture is often cited as necessary for surviving the grim financial realities we face today. Everywhere you look, no one seems to have enough money to buy what they need, but somehow, the need must be fulfilled. If we can just change organization culture, a smart new idea will emerge to save the day; maybe so. How, then, do we change the culture? Practically every B-school and business periodical has written on the subject. One thing they all agree on is that achieving organization change is one of, if not the ...
Original Post Date: Wednesday, October 17, 2012 PRICE was recently tasked by a client to provide an Analysis of Alternatives (AoA). As we assembled a team to complete this, I was tasked with modeling all the alternatives inside TruePlanning.  After the initial data call, we realized that this project would be cumbersome due to the large amount of data. While developing the project plan, I had to think of a crafty way to get data in and out of TruePlanning efficiently. It was interesting to note how much capability I could utilize from the TruePlanning Companion Applications to effectively support ...
Original Post Date: Wednesday, October 17, 2012 A frequent question from students and consulting clients is how to estimate software size when either: detailed functional requirements descriptions are not yet documented or, even if the latter do exist, the resources necessary (in cost and time) for detailed function point (“FP”) counting are prohibitive. If appropriate analogies or detailed use cases are not available, fast function point counting can be a non-starter, without nominal understanding of pre-design software transactions and data functions.  Hence, the challenge is to find an estimating basis for functional measure (i.e., ...
Original Post Date: Thursday, October 11, 2012  Over the past year and a half of customer mentoring, I have been responding to more and more requests regarding how to represent the DoD Acquisition Phases in the development of TruePlanning® cost estimates. With the renewed interest in Total Ownership Costs, there appears to be a desire to have greater visibility into costs by appropriation over a well-defined, well-understood schedule. This need to estimate and report out on cost by appropriation and schedule has been a driver behind the need to represent the acquisition phases more explicitly within TruePlanning® than is ...
Original Post Date: Wednesday, October 10, 2012 Recently I wrote a blog on the “Role of Value Engineering in Affordability Analysis.” In that blog, I wrote about the importance of understanding the cost behavior of each candidate (alternative) architecture as part of the Value Engineering method in order to achieve affordability. I defined “cost behavior” as the relationship of how cost varies as design factors such as new materials, cutting-edge technology, new manufacturing processes, and extensive support requirements associated with a particular function cause cost to change. What are the drivers and how does cost change as those ...
Original Post Date: Monday, October 8, 2012 The answer:  ~ 10.226.  At least that’s the value on our complexity scale we found via calibration after modeling it in TruePlanning®.   Check out this teardown of the new iPhone 5, which breaks it down into a bill of materials, each with an estimated cost.  A colleague had the cool idea to model the iPhone 5 with TruePlanning, using information we could find on the internet.  This was a really thought-provoking exercise to help me as I’m updating our electronics complexity guidance, because the electronics in the iPhone 5 are state-of-the-art. Around ...
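The calibration step described here can be sketched generically: given a cost model whose output increases with complexity, search for the complexity value that reproduces a known cost. The bisection search below is a minimal sketch; the model form and numbers are illustrative stand-ins, not PRICE's actual equations.

```python
def calibrate(model, target, lo, hi, tol=1e-6):
    # Bisection: find the complexity value where the (monotonically
    # increasing) model output matches the observed cost.
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if model(mid) < target:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# Illustrative stand-in for a cost model: cost grows with complexity.
model = lambda cplx: 3.5 * cplx ** 1.2

observed_cost = 57.0  # e.g., a bill-of-materials cost to reproduce
cplx = calibrate(model, observed_cost, 1.0, 30.0)
```

The same idea scales to any single calibrated input, as long as the model responds monotonically to it over the search interval.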
Original Post Date: Friday, October 5, 2012 I am currently involved in co-authoring a white paper on the “Role of Value Engineering in Affordability Analysis.” For the purposes of this discussion I define affordability as that characteristic of a product or service that responds to the buyer’s price, performance, and availability needs simultaneously. The writing of this white paper has been an interesting exercise for me because my fellow co-authors come from different backgrounds and thus have very different points of view. The majority of my colleagues are “card carrying” Systems Engineers. As such, they have a perspective that ...
Original Post Date: Tuesday, October 2, 2012 This past year PRICE Systems has entered into a partnership with the International Software Benchmarking Standards Group (ISBSG).  As part of this partnership we have a corporate subscription to both of their databases – the Development and Enhancement Database and the Maintenance and Support Database.  We can use these for analysis and to develop metrics that will help TruePlanning users be better software estimators.  The ISBSG is one of the oldest and most trusted sources for software project data.  They are a not-for-profit organization dedicated to improving software measurement at an international ...
Original Post Date: Tuesday, October 2, 2012 We’re building a tool that quickly and easily allows you to map costs in your TruePlanning® projects to any custom Work Breakdown Structure (WBS), including templates for the MIL-STD-881C.  By mapping TruePlanning costs to fit your point of view, you can make use of your organization’s existing research, and ensure that your cost estimates are complete, realistic, and can be easily compared to other projects apples-to-apples.  This is great for AoAs and analyzing where and why costs differ between various solutions (not to mention 881C mapping is a required deliverable for major ...
Original Post Date: Thursday, September 27, 2012 I am frequently questioned by clients and prospects about the applicability of PRICE’s parametric software estimation model to agile software development projects.  There are several ways one could respond to this.  My first thought is that if a shop is truly agile, they don’t need an estimation tool.  They know their development team velocity because agile teams are committed to measurement.  They also either know when they need to make a delivery – in which case whatever amount of software they’ve built by that point will be released.  Alternatively, they may know a ...
Original Post Date: Monday, September 24, 2012 The purpose of this blog is to describe the role Value Engineering plays within the affordability process. The blog is not a step-by-step “How To Conduct or Execute” guide to Value Engineering (VE), but a discussion of the context, input, setup, execution hints, and output of Value Engineering in support of conducting affordability analysis and management. As such, it is important to understand the concept of affordability within the Systems Engineering paradigm. This blog is designed to provide insights, lessons learned, and suggestions for using Value Engineering in the affordability process. In ...
Original Post Date: Friday, September 14, 2012 Check out this article about the Defense Information Systems Agency (DISA) and their cloud computing strategy.  With the DOD’s ever-increasing focus on affordability, moving eligible capabilities to the cloud is an excellent plan for the government.  DISA’s strategy includes the consolidation of data centers and network operations centers and the migration of 1.4 million Army email accounts to the cloud.  Cloud computing allows organizations to utilize applications, platforms and hardware through the Internet (or some other network) rather than having to purchase or lease these items.  Cloud computing offers opportunities for cost ...
Original Post Date: Friday, September 7, 2012 So, what goes on out here in the corn fields, anyway?  Did you know Dayton, Ohio is home to the Air Force Materiel Command?  Air Force Materiel Command (AFMC), headquartered at Wright-Patterson AFB, Ohio, “develops, acquires and sustains the aerospace power needed to defend the United States and its interests for today and tomorrow. This is accomplished through management, research, acquisition, development, testing and maintenance of existing and future weapons systems and their components” (afmc.af.mil).   In response to future budget uncertainty and a challenge by Congress to operate more efficiently, AFMC recently ...
Original Post Date: Thursday, August 23, 2012 We’ve bought access to IHS Haystack, which links us to huge amounts of data on government parts and logistics, including electronics running the gamut of technologies, functions, and types of equipment.  The tool combines data from more than 70 military and commercial databases, and includes detailed data on costs, procurement histories, and a wide range of technical characteristics – perfect for building and validating CERs. This data will be very useful for our TP 2013 Electronics Update.  All of this data can be built into a knowledge base and made available to TruePlanning users.  With this ...
Original Post Date: Tuesday, July 3, 2012  Introduction Cost estimate models are developed for many reasons: bids and proposals, should-cost analysis, or measuring the state of a project already underway. Frequently many estimates are created using the same sources of data in an organization. By creating a custom integration solution based on the TruePlanning COM API, repeated model creation can be automated. This can save considerable time, significantly increase the quality of the estimates by removing hand-typed data errors, and speed up the time it takes to produce and gain value from cost estimate models. As an example, this blog ...
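The automation pattern is easy to sketch without the COM specifics: each row of source data becomes one model, so nothing is hand-typed. `CostModel`, `set_input`, and the field names below are hypothetical stand-ins for illustration, not the actual TruePlanning API.

```python
# Pattern sketch: drive repeated model creation from tabular source data.
class CostModel:
    """Stand-in for a real cost-model object (e.g., one reached via COM)."""
    def __init__(self, name):
        self.name = name
        self.inputs = {}

    def set_input(self, key, value):
        self.inputs[key] = value

def build_models(rows):
    # Each source row becomes one model, eliminating hand-typed entry.
    models = []
    for row in rows:
        m = CostModel(row["name"])
        for key, value in row.items():
            if key != "name":
                m.set_input(key, value)
        models.append(m)
    return models

rows = [
    {"name": "Alternative A", "weight_kg": 120, "quantity": 4},
    {"name": "Alternative B", "weight_kg": 95, "quantity": 6},
]
models = build_models(rows)
```

In a real integration the loop body would call the COM objects instead of the stub class, but the data-driven shape of the solution stays the same.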
Original Post Date: Tuesday, July 3, 2012  Introduction A cost estimation model needs to be fed data, and it is only as good as the data used to create it. Frequently the data needed to ‘feed’ the cost model comes from a variety of sources including engineers, subject matter experts or other individuals not directly building the cost model. Just asking these experts to stick their finger in the air and take a guess isn’t always the best approach. Using the COM-enabled Data Input Form Excel solution that comes with TruePlanning can help users obtain the data needed to complete ...
Original Post Date: Tuesday, July 3, 2012 Introduction @RISK and Crystal Ball are two Excel-based applications that allow users to perform uncertainty, sensitivity, or risk analysis on data contained in Excel spreadsheets. The analysis can be performed using various techniques including Monte Carlo and Latin Hypercube.  Many TruePlanning users are interested in performing this type of analysis on the models they create in TruePlanning or may even have requirements to perform this type of analysis on their estimates.   In response to this desire, PRICE Systems L.L.C. has created two Excel-based solutions that allow users to easily leverage the ...
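Independent of either tool, the two sampling techniques can be sketched in a few lines. The cost function and parameter ranges below are illustrative assumptions, not outputs of any real model.

```python
import random

def monte_carlo(n, low, high):
    # Plain Monte Carlo: each draw is independent and uniform over [low, high).
    return [random.uniform(low, high) for _ in range(n)]

def latin_hypercube(n, low, high):
    # Latin Hypercube: split the range into n equal strata, draw exactly one
    # sample per stratum, then shuffle so the draws arrive in random order.
    width = (high - low) / n
    samples = [random.uniform(low + i * width, low + (i + 1) * width)
               for i in range(n)]
    random.shuffle(samples)
    return samples

def cost(labor_rate, hours):
    # Illustrative cost function; a real one would come from the cost model.
    return labor_rate * hours

random.seed(1)
rates = latin_hypercube(1000, 80.0, 120.0)   # $/hour, assumed range
hours = monte_carlo(1000, 900.0, 1100.0)     # effort hours, assumed range
costs = sorted(cost(r, h) for r, h in zip(rates, hours))
p10, p50, p90 = costs[100], costs[500], costs[900]
```

Latin Hypercube's stratification guarantees even coverage of each input range, which is why it usually needs fewer trials than plain Monte Carlo to stabilize the output percentiles.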
Original Post Date: Monday, July 2, 2012  The COSMIC method for counting function points arose out of concerns that the IFPUG (NESMA, FisMA) function points are too concerned with data-intensive business systems and consequently are not adequate for measuring the size of real-time systems.   The COSMIC function point counting method has been designed to be applicable both to business systems such as banking, insurance, etc., and to real-time software such as telephone exchanges and embedded systems such as those found in automobiles and aircraft.  The COSMIC method uses the Functional User Requirements as the basis for the ...
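In COSMIC, each data movement (Entry, Exit, Read, Write) in a functional process contributes one COSMIC Function Point (CFP). A minimal sketch, with hypothetical functional processes for a small real-time component:

```python
# COSMIC sizing: each data movement contributes exactly 1 CFP.
MOVEMENTS = {"Entry", "Exit", "Read", "Write"}

def cosmic_size(processes):
    """processes: mapping of functional-process name -> list of data movements."""
    total = 0
    for name, movements in processes.items():
        unknown = set(movements) - MOVEMENTS
        if unknown:
            raise ValueError(f"{name}: unknown movement types {unknown}")
        total += len(movements)
    return total

# Hypothetical functional processes, for illustration only.
processes = {
    "read sensor":     ["Entry", "Read", "Exit"],
    "update setpoint": ["Entry", "Read", "Write", "Exit"],
}
size = cosmic_size(processes)  # 7 CFP
```

Because the unit is a data movement rather than a weighted data function, the same counting rules apply whether the software is a banking system or an embedded controller.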
Original Post Date: Tuesday, June 19, 2012 In my last blog, I talked about the major research study on electronics being undertaken by the PRICE cost research team this year.  So far, we have visited modern electronics facilities, interviewed electronics experts, and visited customer sites to discuss their electronics estimating challenges. Among other things, we want to revisit and improve our “Manufacturing Complexity for Electronics” calculator.  This calculator guides users through a process of describing the electronic components being modeled, in a way that helps them quantify the complexity.  The first steps involve describing the equipment type, and technologies used.  ...
Original Post Date: Monday, June 18, 2012 This week I’m attending the Better Software Conference in Vegas.   I just attended a great keynote given by Patrick Copeland of Google.  The topic was innovation.  He talked about how innovators beat ideas, pretotypes beat productypes, and data beats opinions.  These sentiments are all part of the pretotyping manifesto. He started with the truth that most new products and services fail and proposed that while this is not unexpected, there is a good way to fail and a bad way to fail.  The good way to fail is to fail fast.  ...
Original Post Date: Tuesday, June 5, 2012 Ever wonder what programming languages are the most productive?  I recently did a little research into this topic using the International Software Benchmarking Standards Group (ISBSG) database. The database contains over 5000 data points with size and effort data for projects from a wide variety of industries, applications, and countries.  Of course, not all 5000 data points were suitable for my investigation.  The software size is measured using functional size metrics but the database accepts projects that use various counting methods.  I narrowed my search to projects that used the International Function Points ...
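The underlying metric is a delivery rate, for example effort hours per function point (lower is more productive). A sketch over hypothetical records shaped like ISBSG fields; the languages and numbers are made up for illustration:

```python
# Hypothetical project records: (primary language, size in function points,
# effort in hours). Real ISBSG records carry many more fields.
projects = [
    ("Java",  400, 3200), ("Java", 250, 2250),
    ("C++",   300, 3300), ("C++",  150, 1800),
    ("COBOL", 500, 4000),
]

def hours_per_fp_by_language(rows):
    # Aggregate totals per language before dividing, so large projects
    # weigh proportionally rather than averaging per-project ratios.
    totals = {}
    for lang, fp, hours in rows:
        fp_sum, hr_sum = totals.get(lang, (0, 0))
        totals[lang] = (fp_sum + fp, hr_sum + hours)
    return {lang: hr / fp for lang, (fp, hr) in totals.items()}

rates = hours_per_fp_by_language(projects)
```

Filtering to a single counting method (as the post does with IFPUG function points) matters here: mixing sizes counted under different methods would make the denominator incomparable across projects.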
Original Post Date: Monday, April 16, 2012 This year, the PRICE cost research team is kicking off a major research study on electronics.  The field of electronics changes very rapidly, and we want to ensure our estimation methods are well suited for the latest technology.  While we constantly collect data and update relationships, this study will go above and beyond the norm.  We'll visit modern electronics facilities, interview experts, visit customers to discuss their electronics estimating challenges, and start a major data collection and analysis effort.  In the end, we plan to add many new electronic classifications, reexamine our ...
Original Post Date: Thursday, April 5, 2012  Introduction The previous blog covered getting started with the TruePlanning API using VBA. Now it is time to tackle collections. Collections in the TruePlanning API are simply objects that hold a list of objects where all the objects are of the same type.  For example, the Project object contains a collection of the Cost Objects that belong to the project.  In order to use the TruePlanning API, mastering collections is essential, but “mastering” is a bit too strong a word given that they are easy to use. By the end of this blog TruePlanning API ...
Original Post Date: Monday, April 2, 2012 In my previous blog, I introduced the relationship between three “breakdown structures” commonly used in project management, and how cost estimates are linked to them and aid in their usefulness.  In this blog, I’ll dig further into these relationships, and explain their impact on my Total Ownership Cost (TOC) solution in TruePlanning.   Let’s use an example of an aircraft being built for the Army.  In this hypothetical example, the prime contractor is Boeing, and they have different departments working on various parts of the aircraft.  Department 10 is responsible for the wings and ...
Original Post Date: Thursday, March 29, 2012 Introduction The new TruePlanning COM Application Programming Interface (API) found in the 2012SR1 release of TruePlanning provides a powerful mechanism for leveraging TruePlanning within custom solutions. Through the COM API, TruePlanning projects can be created, updated, calculated, and saved, allowing for near-limitless potential for integration with TruePlanning.  That said, it is an “API,” which means some programming will need to be done. This discussion is focused on how to get started using the TruePlanning COM API. Development Environments The TruePlanning COM API is written in C++, but is available to any programming ...
Original Post Date: Monday, March 26, 2012 In my research on project management and cost estimation, I often come across three different “breakdown structures” which are useful in dissecting a project from various points of view.  The work breakdown structure (WBS) is oriented around project deliverables; it breaks a system down into subsystems, components, tasks and work packages.  The organization breakdown structure (OBS) shows the structure of the organizations involved in the project, including how these organizations break down into sites, divisions, teams, etc.  Finally, the cost breakdown structure (CBS) examines a project in terms of useful cost categories, ...
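The three structures can be treated as independent coordinates on the same cost items, so one data set rolls up three ways. A minimal sketch, with hypothetical codes (the Dept 10 / wings pairing echoes the aircraft example above):

```python
# Each cost item carries a coordinate in all three breakdown structures.
items = [
    {"cost": 120.0, "wbs": "1.1 Wings",    "obs": "Dept 10", "cbs": "Labor"},
    {"cost": 45.0,  "wbs": "1.1 Wings",    "obs": "Dept 10", "cbs": "Material"},
    {"cost": 80.0,  "wbs": "1.2 Fuselage", "obs": "Dept 20", "cbs": "Labor"},
]

def rollup(items, axis):
    # Sum cost along one breakdown structure, ignoring the other two.
    totals = {}
    for it in items:
        totals[it[axis]] = totals.get(it[axis], 0.0) + it["cost"]
    return totals

by_wbs = rollup(items, "wbs")  # deliverable view
by_obs = rollup(items, "obs")  # organizational view
by_cbs = rollup(items, "cbs")  # cost-category view
```

However the costs are sliced, the three views must reconcile to the same total, which is what makes cross-structure mapping a useful consistency check on an estimate.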
Original Post Date: Friday, March 23, 2012 Recently I have been playing around with the International Software Benchmarking Standards Group (ISBSG) database for Development and Enhancement projects.  And in the interest of full disclosure I admit that I am more than a little excited to have close to 6000 data points at my fingertips.  I will further admit that there’s something quite daunting about having this much data: where to start, what I should be looking for, and how I can best use this data to offer some useful guidance to inform software cost estimation.  For those of you not familiar ...
Original Post Date: Friday, February 24, 2012 When software developers first started writing programs for the Windows® operating system it wasn’t pretty.   Everything had to be done from scratch – there was no easy access to tools, libraries and drivers to facilitate development.  A similar tale can be told by the earliest web site developers.   A web application framework is an SDK (Software Development Kit) for web developers.  It is intended to support the development of web services, web applications and dynamic websites.  The framework is intended to increase web development productivity by offering libraries of functionality common ...
Original Post Date: Thursday, February 9, 2012  Model Driven Engineering is a software development methodology focused on creating domain models that abstract the business knowledge and processes of an application domain.  Domain models allow the engineer to pursue a solution to a business problem without considering the eventual platform and implementation technology.  Model Driven Development is a paradigm within Model Driven Engineering that uses models as primary artifacts of the development process, with automation to go from models to actual implementation.  Model Driven Architecture is an approach for developing software within the Model Driven Development paradigm.  It was ...