Predictive Analytics for Improved Cost Management



Original Post Date: Friday, October 5, 2012 I am currently involved in co-authoring a white paper on the “Role of Value Engineering in Affordability Analysis.” For the purposes of this discussion I define affordability as that characteristic of a product or service that responds to the buyer’s price, performance, and availability needs simultaneously. The writing of this white paper has been an interesting exercise for me because my fellow co-authors come from different backgrounds and thus have very different points of view. The majority of my colleagues are “card carrying” Systems Engineers. As such, they have a perspective that ...
Original Post Date: Tuesday, October 2, 2012 This past year PRICE Systems has entered into a partnership with the International Software Benchmarking Standards Group (ISBSG).  As part of this partnership we have a corporate subscription to both of their databases – the Development and Enhancement Database and the Maintenance and Support Database.  We can use these for analysis and to develop metrics that will help TruePlanning users be better software estimators.  The ISBSG is one of the oldest and most trusted sources for software project data.  They are a not-for-profit organization dedicated to improving software measurement at an international ...
Original Post Date: Tuesday, October 2, 2012 We’re building a tool that quickly and easily allows you to map costs in your TruePlanning(R) projects to any custom Work Breakdown Structure (WBS), including templates for the MIL-STD-881C.  By mapping TruePlanning costs to fit your point of view, you can make use of your organization’s existing research, and ensure that your cost estimates are complete, realistic, and can be easily compared to other projects apples-to-apples.  This is great for AoAs and analyzing where and why costs differ between various solutions (not to mention 881C mapping is a required deliverable for major ...
Original Post Date: Thursday, September 27, 2012 I am frequently questioned by clients and prospects about the applicability of PRICE’s parametric software estimation model to agile software development projects.  There are several ways one could respond to this.  My first thought is that if a shop is truly agile, they don’t need an estimation tool.  They know their development team velocity because agile teams are committed to measurement.  They also either know when they need to make a delivery – in which case whatever amount of software they’ve built by that point will be released.  Alternatively they may know a ...
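The velocity arithmetic this post alludes to is simple enough to sketch. The backlog size, velocity, and sprint length below are invented for illustration; they are not drawn from the post or from any PRICE model.

```python
# Illustrative only: a team that measures its velocity can project schedule or scope
# without a separate estimation tool. All numbers are invented for this sketch.
backlog_points = 240        # remaining story points in the release backlog
velocity = 32               # measured average points completed per sprint
sprint_weeks = 2

sprints_needed = -(-backlog_points // velocity)   # ceiling division
print(f"Sprints to burn down the backlog: {sprints_needed}")
print(f"Calendar time: about {sprints_needed * sprint_weeks} weeks")

# With a fixed delivery date instead, the question becomes how much scope fits:
sprints_until_delivery = 5
print(f"Scope deliverable by the date: {sprints_until_delivery * velocity} points")
```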
Original Post Date: Monday, September 24, 2012 The purpose of this blog is to describe the role Value Engineering plays within the affordability process. The blog is not a step-by-step “How to Conduct or Execute” guide to Value Engineering (VE); rather, it is a discussion of the context, input, setup, execution hints, and output of Value Engineering in support of conducting affordability analysis and management. As such, it is important to understand the concept of affordability within the Systems Engineering paradigm. This blog is designed to provide insights, lessons learned, and suggestions for using Value Engineering in the affordability process. In ...
Original Post Date: Friday, September 14, 2012 Check out this article about the Defense Information Systems Agency (DISA) and their cloud computing strategy.  With the DOD’s ever increasing focus on affordability, moving eligible capabilities to the cloud is an excellent plan for the government.  DISA’s strategy includes the consolidation of data centers and network operations centers and the migration of 1.4 million Army email accounts to the cloud.  Cloud computing allows organizations to utilize applications, platforms and hardware through the Internet (or some other network) rather than having to purchase or lease these items.  Cloud computing offers opportunities for cost ...
Original Post Date: Friday, September 7, 2012 So, what goes on out here in the corn fields, anyway?  Did you know Dayton, Ohio is home to the Air Force Materiel Command?  Air Force Materiel Command (AFMC), headquartered at Wright-Patterson AFB, Ohio, “develops, acquires and sustains the aerospace power needed to defend the United States and its interests for today and tomorrow. This is accomplished through management, research, acquisition, development, testing and maintenance of existing and future weapons systems and their components” (afmc.af.mil).   In response to future budget uncertainty and a challenge by Congress to operate more efficiently, AFMC recently ...
Original Post Date: Thursday, August 23, 2012 We’ve bought access to IHS Haystack, which links us to huge amounts of data on government parts and logistics, including electronics running the gamut of technologies, functions, and types of equipment.  The tool combines data from more than 70 military and commercial databases, and includes detailed data on costs, procurement histories, and a wide range of technical characteristics – perfect for building and validating CERs. This data will be very useful for our TP 2013 Electronics Update.  All of this data can be built into a knowledge base and made available to TruePlanning users.  With this ...
Original Post Date: Tuesday, July 3, 2012  Introduction Cost estimate models are developed for many reasons: bids and proposals, should-cost analysis, or measuring the state of a project already underway. Frequently many estimates are created from the same sources of data in an organization. By creating a custom integration solution based on the TruePlanning COM API, repeated model creation can be automated. This can save considerable time, significantly increase the quality of the estimates by removing hand-typed data errors, and speed up the time it takes to produce and gain value from cost estimate models. As an example, this blog ...
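A rough sketch of the kind of integration described: drive a COM automation server from a script so that many estimates can be generated from one data source. The ProgID, method names, and file format below are placeholders invented for illustration, not the actual TruePlanning COM API; the post itself (which uses VBA) and the API documentation cover the real calls.

```python
# Hypothetical sketch only: the ProgID and every method/property name below are
# PLACEHOLDERS, not the documented TruePlanning COM API. The pattern (read a data
# source, then create/populate/calculate/save a project per row) is the point.
import csv
import win32com.client  # pywin32, Windows only

def build_estimates(csv_path):
    app = win32com.client.Dispatch("PRICE.TruePlanning")      # placeholder ProgID
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            project = app.CreateProject(row["name"])           # placeholder method
            obj = project.AddCostObject(row["cost_object"])    # placeholder method
            obj.SetInput("Weight", float(row["weight"]))       # placeholder method
            project.Calculate()                                 # placeholder method
            project.SaveAs(row["name"] + ".tp")                 # placeholder method

if __name__ == "__main__":
    build_estimates("estimate_inputs.csv")
```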
Original Post Date: Tuesday, July 3, 2012  Introduction A cost estimation model needs to be fed data, and it is only as good as the data used to create it. Frequently the data needed to ‘feed’ the cost model comes from a variety of sources including engineers, subject matter experts or other individuals not directly building the cost model. Just asking these experts to stick their finger in the air and take a guess isn’t always the best approach. Using the COM-enabled Data Input Form Excel solution that comes with TruePlanning can help users obtain the data needed to complete ...
Original Post Date: Tuesday, July 3, 2012 Introduction @RISK and Crystal Ball are two Excel based applications that allow users to perform uncertainty, sensitivity or risk analysis on data contained in Excel spreadsheets. The analysis can be performed using various techniques including Monte Carlo and Latin Hypercube.  Many TruePlanning users are interested in performing this type of analysis on the models they create in TruePlanning or may even have requirements to perform this type of analysis on their estimates.   In response to this desire, PRICE Systems L.L.C. has created two Excel based solutions that allow users to easily leverage the ...
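For readers new to the technique, the sketch below shows what a bare-bones Monte Carlo pass over a cost roll-up looks like in plain Python, standing in for what @RISK, Crystal Ball, and the PRICE Excel solutions do against live spreadsheet cells. The triangular distributions and the roll-up are invented for the example.

```python
# Minimal Monte Carlo illustration of cost uncertainty analysis.
# Distributions and the roll-up are invented; a real analysis would point at the
# inputs and outputs of a TruePlanning model exposed in Excel.
import random
import statistics

def one_trial():
    hardware = random.triangular(800_000, 1_500_000, 1_000_000)   # (low, high, mode)
    software = random.triangular(400_000, 1_200_000, 600_000)
    integration = 0.15 * (hardware + software)                     # simple linked cost
    return hardware + software + integration

trials = sorted(one_trial() for _ in range(10_000))
print(f"Mean total cost : {statistics.mean(trials):,.0f}")
print(f"50th percentile : {trials[len(trials) // 2]:,.0f}")
print(f"80th percentile : {trials[int(0.8 * len(trials))]:,.0f}")
```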
Original Post Date: Monday, July 2, 2012  The COSMIC method for counting function points arose out of concerns that IFPUG (and NESMA, FiSMA) function points are oriented toward data-intensive business systems and are consequently not well suited to measuring the size of real-time systems.   The COSMIC function point counting method has been designed to be applicable both to business systems such as banking, insurance, etc., and to real-time software such as telephone exchanges and embedded systems such as those found in automobiles and aircraft.  The COSMIC method uses the Functional User Requirements as the basis for the ...
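As a quick illustration of the counting idea: COSMIC identifies the data movements (Entry, Exit, Read, Write) in each functional process, and each movement contributes one COSMIC Function Point (CFP). The functional process below is invented for the example.

```python
# Toy COSMIC count: each data movement identified in a functional process is 1 CFP.
# The functional process and its movements are invented for illustration.
data_movements = {
    "Entry": ["pilot enters a new waypoint"],            # data crossing into the software
    "Read":  ["read current position", "read airspace constraints"],
    "Write": ["store the updated flight plan"],
    "Exit":  ["send plan to the display", "send plan to the autopilot"],
}

cfp = sum(len(m) for m in data_movements.values())
print(f"Functional size: {cfp} CFP")   # 6 CFP for this toy process
```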
Original Post Date: Tuesday, June 19, 2012 In my last blog, I talked about the major research study on electronics being undertaken by the PRICE cost research team this year.  So far, we have visited modern electronics facilities, interviewed electronics experts, and visited customer sites to discuss their electronics estimating challenges. Among other things, we want to revisit and improve our “Manufacturing Complexity for Electronics” calculator.  This calculator guides users through a process of describing the electronic components being modeled, in a way that helps them quantify the complexity.  The first steps involve describing the equipment type, and technologies used.  ...
Original Post Date: Monday, June 18, 2012 This week I’m attending the Better Software Conference in Vegas.   I just attended a great keynote given by Patrick Copeland of Google.  The topic was innovation.  He talked about how innovators beat ideas, pretotypes beat prototypes and data beats opinions.  These sentiments are all part of the pretotyping manifesto. He started with the truth that most new products and services fail and proposed that while this is not unexpected, there is a good way to fail and a bad way to fail.  The good way to fail is to fail fast.  ...
Original Post Date: Tuesday, June 5, 2012 Ever wonder what programming languages are the most productive?  I recently did a little research into this topic using the International Software Benchmarking Standards Group (ISBSG) database. The database contains over 5000 data points with size and effort data for projects from a wide variety of industries, applications and countries.  Of course not all 5000 data points were suitable for my investigation.  The software size is measured using functional size metrics but the database accepts projects that use various counting methods.  I narrowed my search to projects that used the International Function Points ...
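The comparison sketched below mirrors the kind of analysis described: ISBSG reports a Project Delivery Rate (PDR) in hours per function point, where a lower PDR means higher productivity. The records here are invented stand-ins, not actual ISBSG data points.

```python
# Sketch of comparing productivity by language using PDR = effort hours / function
# points (lower is better). The project records are invented, not ISBSG data.
from collections import defaultdict
from statistics import median

projects = [
    {"language": "Java",  "fp": 320, "hours": 3200},
    {"language": "Java",  "fp": 150, "hours": 1200},
    {"language": "COBOL", "fp": 500, "hours": 6500},
    {"language": "C#",    "fp": 210, "hours": 1700},
]

pdr = defaultdict(list)
for p in projects:
    pdr[p["language"]].append(p["hours"] / p["fp"])

for lang, rates in sorted(pdr.items()):
    print(f"{lang:6s} median PDR: {median(rates):4.1f} hours per FP")
```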
Original Post Date: Monday, April 16, 2012 This year, the PRICE cost research team is kicking off a major research study on electronics.  The field of electronics changes very rapidly, and we want to ensure our estimation methods are well suited for the latest technology.  While we constantly collect data and update relationships, this study will go above and beyond the norm.  We'll visit modern electronics facilities, interview experts, visit customers to discuss their electronics estimating challenges, and start a major data collection and analysis effort.  In the end, we plan to add many new electronic classifications, reexamine our ...
Original Post Date: Thursday, April 5, 2012  Introduction The previous blog covered getting started with the TruePlanning API using VBA. Now it is time to tackle collections. Collections in the TruePlanning API are simply objects that hold a list of objects where all the objects are of the same type.  For example, the Project object contains a collection of the Cost Objects that belong to the project.  In order to use the TruePlanning API, mastering collections is essential, though “mastering” is perhaps too strong a word given that they are easy to use. By the end of this blog TruePlanning API ...
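The post works through this in VBA; the sketch below shows the same Count/Item iteration pattern from Python instead, with the ProgID and member names left as placeholders since they are not drawn from the documented API.

```python
# Hypothetical sketch of walking a COM collection. ProgID, methods, and properties
# are PLACEHOLDERS for illustration; COM collections conventionally expose Count and
# Item and are 1-indexed, which is the pattern shown here.
import win32com.client

app = win32com.client.Dispatch("PRICE.TruePlanning")    # placeholder ProgID
project = app.OpenProject("uav_estimate.tp")            # placeholder method

cost_objects = project.CostObjects                       # a collection of cost objects
for i in range(1, cost_objects.Count + 1):               # COM collections are 1-based
    obj = cost_objects.Item(i)
    print(obj.Name, obj.Cost)                            # placeholder properties
```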
Original Post Date: Monday, April 2, 2012 In my previous blog, I introduced the relationship between three “breakdown structures” commonly used in project management, and how cost estimates are linked to them and aid in their usefulness.  In this blog, I’ll dig further into these relationships, and explain their impact on my Total Ownership Cost (TOC) solution in TruePlanning.   Let’s use an example of an aircraft being built for the Army.  In this hypothetical example, the prime contractor is Boeing, and they have different departments working on various parts of the aircraft.  Department 10 is responsible for the wings and ...
Original Post Date: Thursday, March 29, 2012 Introduction The new TruePlanning COM Application Programming Interface (API) found in the 2012SR1 release of TruePlanning provides a powerful mechanism for leveraging TruePlanning within custom solutions. Through the COM API, TruePlanning projects can be created, updated, calculated and saved, allowing for near-limitless potential for integration with TruePlanning.  That said, it is an “API,” which means some programming will need to be done. This discussion is focused on how to get started using the TruePlanning COM API. Development Environments The TruePlanning COM API is written in C++, but is available to any programming ...
Original Post Date: Monday, March 26, 2012 In my research on project management and cost estimation, I often come across three different “breakdown structures” which are useful in dissecting a project from various points of view.  The work breakdown structure (WBS) is oriented around project deliverables; it breaks a system down into subsystems, components, tasks and work packages.  The organization breakdown structure (OBS) shows the structure of the organizations involved in the project, including how these organizations break down into sites, divisions, teams, etc.  Finally, the cost breakdown structure (CBS) examines a project in terms of useful cost categories, ...
Original Post Date: Friday, March 23, 2012 Recently I have been playing around with the International Software Benchmarking Standards Group (ISBSG) database for Development and Enhancement projects.  And in the interest of full disclosure I admit that I am more than a little excited to have close to 6000 data points at my fingertips.  I will further admit that there’s something quite daunting about having this much data: where to start, what to look for, and how best to use this data to offer some useful guidance to inform software cost estimation.  For those of you not familiar ...
Original Post Date: Friday, February 24, 2012 When software developers first started writing programs for the Windows® Operating System it wasn’t pretty.  Everything had to be done from scratch – there was no easy access to tools, libraries and drivers to facilitate development.  A similar tale can be told by the earliest web site developers.   A web application framework is an SDK (Software Development Kit) for web developers.  It is intended to support the development of web services, web applications and dynamic websites.  The framework is intended to increase web development productivity by offering libraries of functionality common ...
Original Post Date: Thursday, February 9, 2012  Model Driven Engineering is a software development methodology focused on creating domain models that abstract the business knowledge and processes of an application domain.  Domain models allow the engineer to pursue a solution to a business problem without considering the eventual platform and implementation technology.  Model Driven Development is a paradigm within Model Driven Engineering that uses models as a primary artifact of the development process, using automation to go from models to actual implementation.  Model Driven Architecture is an approach for developing software within the Model Driven Development paradigm.  It was ...
Original Post Date: Thursday, December 15, 2011 This week CAST released their second annual CRASH (CAST Report on Application Software Health) Report.   The summary findings can be found here. You will also find a link to the Executive Summary.   The report highlights trends based on a static analysis of the code from 745 applications from 160 organizations.  The analysis is based on five structural quality characteristics: security, performance, robustness, transferability and changeability.  Some of the more interesting findings include: * COBOL applications have higher security scores than other languages studied (meaning they have better security).  I personally found this finding ...
Original Post Date: Monday, December 12, 2011 Check out this paper “The Economics of Community Open Source Software Projects: An Empirical Analysis of Maintenance Effort.”  In it the authors hypothesize that Open Source practices increase the quality of the software that gets produced and subsequently lead to code that is less costly to maintain.  Low quality code must be refactored more frequently than high quality code and there is substantial evidence that maintenance interventions tend to lead to even more degradation of the quality of the code.  So not only are low quality applications more expensive to maintain, the unit ...
Original Post Date: Monday, November 28, 2011  Check out this blog post  on project estimation.  The author discusses the practice of ‘padding’ effort estimates and how destructive this practice can be to healthy project management.  She suggests that project team members, rather than padding their individual efforts at a task level, should collaborate with project management in order to produce a good solid project plan with sufficient contingency reserves.  This allows for the project plan to reflect the most likely case but contains a safety net for those cases where the stuff that was unknown at the time of project ...
Original Post Date: Tuesday, November 8, 2011  Last week I attended the 26th annual COCOMO forum.  This meeting is an interesting combination of conference and working group and for me it’s a great place to take the pulse of the software and systems estimating community.  Lots of times you’ll go to a conference like this and feel as though the same old things are repeated year after year.  Not so with this conference – it is always a great mix of seasoned practitioners and graduate students with both groups providing forward looking information and inspiration on a variety of ...
Original Post Date: Thursday, October 20, 2011  Check out this article, “Why IT Projects May be Riskier than you Think”.  If you read through the comments you will see that this article truly resonates with many in the field.  In the article the authors discuss research on 1,471 IT projects (large projects with an average cost of $167 million), comparing budgets and expected performance with actual costs and results.  Their results were surprising in that the average overrun was only 27%.  It turns out that the average isn’t what requires study, but rather the outliers.  The study found that ...
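A tiny numeric illustration of that last point, with invented overrun figures: a modest average can coexist with a catastrophic tail, which is exactly why the outliers deserve the attention.

```python
# Invented overrun percentages for ten projects; one "black swan" dominates the risk
# even though the mean looks modest.
overruns = [0, 5, 8, 10, 12, 15, 18, 20, 25, 157]   # percent over budget

mean = sum(overruns) / len(overruns)
median = sorted(overruns)[len(overruns) // 2]
print(f"Mean overrun  : {mean:.0f}%")                   # 27% – looks manageable
print(f"Median overrun: {median}%")                     # the typical project is fine
print(f"Worst project : {max(overruns)}% over budget")  # the outlier is the real story
```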
Original Post Date: Monday, October 3, 2011 Here’s an interesting article, “Technical Debt as Metaphor for Future Cost.”  In it the author discusses the acceptability of using the metaphor of technical debt to facilitate communications between business leaders and the software team when negotiating around the triangle (time, money, scope).  The author accepts the use of this metaphor as good “short-hand” for communicating the fact that avoiding the work now is not sparing the cost but just rearranging the way the costs are incurred – and often increasing the overall costs that need to be spent.  The ...
Original Post Date: Friday, September 9, 2011 I have recently been following an animated thread on LinkedIn, “Death of a Metaphor – Technical Debt.” It’s been live for 2 months with over 200 contributions from dozens of different people.  The discussion was launched by questioning whether continued use of this metaphor makes sense.  The discussion thread weaves and bobs around actually answering this question, but it’s amazing how passionate the world is on this topic.  My personal opinion is that it’s a perfectly adequate metaphor because it helps create a discussion between IT and the business leaders in terms ...
Original Post Date: Friday, September 2, 2011  The IEEE published “Top 11 Technologies of the Decade” in the January 2011 edition of IEEE Spectrum magazine.  It should come as a surprise to no one that the Smartphone was number 1 on this list.  The answer to the author’s question “Is your phone smarter than a fifth grader” was a resounding YES![1]   In 1983 Motorola introduced the first hand-held cellular phone.  It weighed in at two and a half pounds, had memory capacity for 30 phone numbers, took 10 hours to recharge and had a selling price of $4000 ...
Original Post Date: Thursday, August 25, 2011  Check out this Report on Big Data from McKinsey & Company published in June 2011.  Back in the day, when personal computers were becoming widely accepted and Microsoft Windows© was the new cool thing, SneakerNet was a common means of sharing data.   Certainly the introduction and subsequent improvements of networking technology and the Internet have made data sharing a whole lot easier and quicker.  But the concept of Big Data creates a whole new level of opportunity and potential for collecting and using data in ways heretofore unthinkable. So what is Big Data?   According ...
Original Post Date: Thursday, August 11, 2011 A Bell 430 Helicopter's rear blade spins 1884 times per minute; how many times for the main blade?  Submit your estimate in the comments section!
Original Post Date: Monday, August 8, 2011  PRICE Systems is involved with the formulation of the total Life Cycle/Whole Life of systems of systems and is asking the question, "Is the META 'V' on the right track?" As INCOSE considers the total life of the system, the traditional Development Engineering V must be expanded to account for the life of the system beyond Initial Operational Capability.  In this writer's opinion, we have to extend the development V to account for this with the Meta-V concept.  The Affordability Working Group recognizes that IOC is:  The initial release of both the Primary and Enabling ...
Original Post Date: Monday, August 1, 2011 After many years of working with systems engineers and design engineers, it became apparent to me that the cost of the system they were designing / building mostly seemed to be an afterthought. Maybe not to the Lead Systems Engineer or Program Manager, but certainly down in the trenches. The engineers working at the subsystem, component, and element levels always expressed frustration with having too much to think about to add one more variable, such as cost estimation, to their workload. I posit that this can no longer be the case. In ...
Original Post Date: Friday, July 29, 2011 PRICE Systems is integral to the INCOSE Design for Affordability Initiative as a vital member of the international Working Group. The INCOSE Affordability Working Group’s definition of Affordability is: Affordability is the balance of system performance, cost and schedule constraints over the system life while satisfying mission needs in concert with strategic investment and organizational needs. The INCOSE Affordability Working Group’s definition of SE Design for Affordability is: Design for Affordability is the Systems Engineering practice of balancing system performance and risk with cost and schedule constraints over the system life while satisfying system operational needs in ...
Original Post Date: Wednesday, July 27, 2011 Open Source software is software that is distributed publicly with all of its source code.  Users of open source software are encouraged to review the source code, make changes to it and share those changes with the rest of the user community.  The value in open source is that providing the source code to the user community allows those in the community who are willing and able to do so to make improvements, add features, and fix bugs. Open source takes the notion of peer review to the next level.   It means that instead of ...
Original Post Date: Wednesday, July 20, 2011 Electronics Complexity and Quality Levels…   The PRICE Calculator for Electronics Manufacturing Complexity, shown below, has a subtle but very powerful feature for Quality Adjustment, based on Mil-Hdbk-217E Quality Levels.    The above shows “None”, which yields the standard MCPLXE, in this case an 8.07 value applying 100% Large Scale Integrated Circuits for Display (with CRT) equipment as an example. Yet, in the course of estimating a mission-critical automatic tester with multiple controller-card assemblies for an airborne-military platform, I adjusted for more stringent quality levels – specifically the “S-1” level consistent with Mil-Std-975/Mil-Std-1547 per below.   Note that ...
Original Post Date: Wednesday, July 20, 2011 How many golf balls would it take to circle the Earth at the equator?  Submit your estimate in the comments section!
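If you want a back-of-the-envelope cross-check before posting a guess, the arithmetic is just the equatorial circumference divided by a golf ball's diameter; the sketch below uses the published figures for each.

```python
# Back-of-the-envelope estimate, not an official answer.
equator_m = 40_075_000        # Earth's equatorial circumference: ~40,075 km
ball_diameter_m = 0.04267     # regulation golf ball: at least 1.68 in (42.67 mm)

balls = equator_m / ball_diameter_m
print(f"Roughly {balls / 1e6:.0f} million golf balls")   # on the order of 940 million
```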
Original Post Date: Tuesday, July 12, 2011 Check out this article from CIO magazine about managing your project budget.   The author, Jason Westland, suggests four things necessary to maintain control of your project budget.  While these are not earth shattering suggestions, sometimes project managers in the throes of a project can lose sight of their importance.  The strategies are: * Continually forecast the budget * Regularly forecast resource usage * Keep the team informed * Manage scope meticulously Or to put it another way – respect and revisit the Project Management Triangle.  (To learn more about the Project Management Triangle go to this ...
Original Post Date: Monday, July 11, 2011 There is a ton of code out there and we’re constantly adding more.  Gartner reported in 2009 that there were more than 310 billion lines of code in use worldwide. MS Word has grown from 27,000 lines of code in the first version to about 2 million in 2010.  The Windows Operating System grew from 3 million lines of code in 1992 to 20 Million in 1998.  You get the point – there’s lots of code out there doing lots of the same things that we may want our software to do. One ...
Original Post Date: Tuesday, June 21, 2011  What percent of active U.S. military personnel are from the Marine Corps?  Submit your estimate in the comments section!
Original Post Date: Friday, June 17, 2011  Building transparency and traceability into your estimating process leads to more defensible estimates, and we can help you do that. We will demonstrate how historical data is transformed into predictive models.   You will learn how your data can be synthesized into custom models that can be employed in support of third party models within a single analytical framework. Learn more at our webinar on June 29th @ 11am Eastern.  Reserve your no-charge, no-obligation webinar seat now at: https://www2.gotomeeting.com/register/372682434
Original Post Date: Wednesday, June 15, 2011 At the 2011 ISPA Conference, I conducted a half-day workshop, How to Develop Data-Driven Cost Estimating Relationships in TruePlanning. The attendees at the workshop learned how to import their own data into TruePlanning and develop custom Cost Estimating Relationships. We covered three case studies: * In the UCAS case study we demonstrated how we can build CERs at a higher level to provide a test of reasonableness to the CAPE. * In the SRDR case study we demonstrated how we develop a CER to estimate SLOC based on historical data and use the results ...
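For readers who have not built a CER before, the sketch below fits the classic power-law form cost = a · driver^b by ordinary least squares on the log-transformed data. The data points are invented; they are not the UCAS or SRDR case-study data, and the workshop covered doing this inside TruePlanning itself.

```python
# Minimal sketch: fit a power-law CER, cost = a * driver^b, via least squares in
# log-log space. The (weight, cost) pairs are invented for illustration.
import math

weights = [120, 250, 400, 640, 900]     # cost driver, e.g. component weight (lb)
costs   = [1.8, 3.1, 4.6, 6.5, 8.9]     # observed cost ($M)

xs = [math.log(w) for w in weights]
ys = [math.log(c) for c in costs]
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
a = math.exp(my - b * mx)

print(f"CER: cost = {a:.3f} * weight^{b:.3f}")
print(f"Point estimate at 500 lb: ${a * 500 ** b:.2f}M")
```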
Original Post Date: Wednesday, June 15, 2011 While teaching an introductory TruePlanning for Software Estimating course this week at an Army location, I was asked to follow up with a clarification on the “percent adapted” calculation.  The official PRICE training materials definitions are: * Percent of Design Adapted – the percentage of the existing (adapted code) design that must change to enable the adapted code to function and meet the software project requirements; * Percent of Code Adapted – the percentage of the adapted code that must change to enable the adapted code to function and meet the software project requirements.  The former, Design, ...
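To make the two inputs concrete, here is a small worked example that blends them with the COCOMO II adaptation adjustment factor. To be clear, this weighting (0.4 design, 0.3 code, 0.3 integration and test) is COCOMO's, shown only for illustration; it is not the TruePlanning calculation the post goes on to describe.

```python
# Illustrative only: a COCOMO II-style adaptation adjustment, NOT the TruePlanning
# formula. It shows how "percent of design adapted" and "percent of code adapted"
# can feed an equivalent-size calculation.
adapted_sloc = 20_000
pct_design_adapted = 0.25   # share of the existing design that must change
pct_code_adapted   = 0.40   # share of the existing code that must change
pct_retest         = 0.50   # share requiring re-integration and re-test (COCOMO's IM)

aaf = 0.4 * pct_design_adapted + 0.3 * pct_code_adapted + 0.3 * pct_retest
print(f"Adaptation adjustment factor: {aaf:.2f}")            # 0.37
print(f"Equivalent new SLOC: {adapted_sloc * aaf:,.0f}")      # 7,400
```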
Original Post Date: Friday, June 10, 2011 I’m on my way home from the ISPA/SCEA (International Society of Parametric Analysts, Society of Cost Estimating and Analysis) Conference held in Albuquerque this week.  Attendance was very good (2nd best in the conference’s history) and the content seemed especially good this week.  I attended lots of good talks on topics ranging from SEPM (System Engineering, Project Management) cost estimating, Joint Confidence Levels, Software Estimating, Affordability, Agile software development and estimating for Enterprise Resource Planning Systems.   Of course, just because the topics are good and well presented doesn’t mean I have ...
Original Post Date: Wednesday, June 8, 2011  What is the fuel capacity (in gallons or liters) of a Boeing 737 jet?  Submit your estimate in the comments section!
Original Post Date: Friday, June 3, 2011  If I Google the phrase “cloud computing” I get about 49,900,000 hits.  That’s a lot of hits – more than 10 times the hits I get if I Google “service oriented architecture.”  This made me think that cloud computing is an area I needed to learn more about. So what are we really talking about when we talk about cloud computing?  “The cloud” is a generally accepted euphemism for the Internet.  End users access computing assets from the cloud using a model similar to one that homes and offices use to get electricity ...
Original Post Date: Wednesday, May 25, 2011  Going to ISPA SCEA in New Mexico?  If so, join us for a workshop on data driven cost estimating.  Description:  Building transparency and traceability into your estimating process leads to more defensible estimates.  This hands-on workshop demonstrates how historical data is transformed into predictive models.   You will learn how your organization’s data can be synthesized into custom models that can be employed in support of third party models within a single analytical framework.  Participants will learn: (1) To develop system level estimating relationships to provide a test of reasonableness and historical cross-check to proposed estimates. (2) To develop ...