Predictive Analytics for Improved Cost Management
Original Post Date: Monday, July 11, 2011 There is a ton of code out there and we’re constantly adding more.  Gartner reported in 2009 that there were more than 310 billion lines of code in use worldwide. MS Word has grown from 27,000 lines of code in the first version to about 2 million in 2010.  The Windows Operating System grew from 3 million lines of code in 1992 to 20 Million in 1998.  You get the point – there’s lots of code out there doing lots of the same things that we may want our software to do. One ...
Original Post Date: Wednesday, June 15, 2011 At the 2011 ISPA Conference, I conducted a ½-day workshop, "How To Develop Data-Driven Cost Estimating Relationships in TruePlanning." The attendees at the workshop learned how to import their own data into TruePlanning and develop custom Cost Estimating Relationships. We covered three case studies:
· In the UCAS case study we demonstrated how we can build CERs at a higher level to provide a test of reasonableness to the CAPE.
· In the SRDR case study we demonstrated how we develop a CER to estimate SLOC based on historical data and use the results ...
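The kind of data-driven CER described above is often fit as a power law (cost = a · driver^b) via least squares in log-log space. Here is a minimal sketch of that idea; the data points, variable names, and the weight-vs-cost pairing are invented for illustration and are not TruePlanning internals:

```python
import math

# Hypothetical historical data: (cost driver, cost) pairs,
# e.g. subsystem weight (lb) vs. cost ($K). Illustrative only.
history = [(100, 520), (200, 910), (400, 1650), (800, 3000)]

def fit_power_cer(points):
    """Fit cost = a * x**b by ordinary least squares in log-log space."""
    logs = [(math.log(x), math.log(y)) for x, y in points]
    n = len(logs)
    sx = sum(lx for lx, _ in logs)
    sy = sum(ly for _, ly in logs)
    sxx = sum(lx * lx for lx, _ in logs)
    sxy = sum(lx * ly for lx, ly in logs)
    # Closed-form simple linear regression on (log x, log y)
    b = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    a = math.exp((sy - b * sx) / n)
    return a, b

a, b = fit_power_cer(history)
estimate = a * 600 ** b  # predicted cost for a hypothetical 600 lb subsystem
```

A CER built this way also supports the "test of reasonableness" use: comparing a bottom-up estimate against what the fitted curve predicts for the same driver value.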
Original Post Date: Friday, June 10, 2011 I’m on my way home from the ISPA/SCEA (International Society of Parametric Analysts / Society of Cost Estimating and Analysis) Conference held in Albuquerque this week. Attendance was very good (2nd best in the conference’s history) and the content seemed especially good this week. I attended lots of good talks on topics ranging from SEPM (Systems Engineering, Project Management) cost estimating, Joint Confidence Levels, Software Estimating, Affordability, Agile software development, and estimating for Enterprise Resource Planning Systems. Of course, just because the topics are good and well presented doesn’t mean I have ...
Original Post Date: Friday, June 3, 2011 If I Google the phrase “cloud computing” I get about 49,900,000 hits. That’s a lot of hits – more than 10 times the hits I get if I Google “service oriented architecture.” This made me think that cloud computing is an area I needed to learn more about. So what are we really talking about when we talk about cloud computing? “The cloud” is a generally accepted metaphor for the Internet. End users access computing assets from the cloud using a model similar to the one that homes and offices use to get electricity ...
Original Post Date: Wednesday, May 25, 2011 Going to ISPA SCEA in New Mexico? If so, join us for a workshop on data-driven cost estimating. Description: Building transparency and traceability into your estimating process leads to more defensible estimates. This hands-on workshop demonstrates how historical data is transformed into predictive models. You will learn how your organization’s data can be synthesized into custom models that can be employed in support of third-party models within a single analytical framework. Participants will learn: (1) to develop system-level estimating relationships to provide a test of reasonableness and historical cross-check to proposed estimates; (2) to develop ...
Original Post Date: Tuesday, March 1, 2011 The concept of the fuel cell was first published in 1838 by Christian Friedrich Schönbein. Based on this publication, Sir William Grove invented the precursor of the fuel cell in 1839. The Grove Cell created current by applying two acids to zinc and platinum electrodes separated by a porous ceramic pot. In 1842 Grove developed the first actual fuel cell, which produced electricity with hydrogen and oxygen, much like many fuel cells in use today. Fuel cells remained an intellectual curiosity until the 1960s, when the US space program identified a requirement for ...
Original Post Date: Monday, February 28, 2011 What follows is PRICE's interpretation of DOD-HDBK-343, which addresses design, construction and testing requirements for a type of space equipment. Within the document are specified several levels of class definitions for space programs, space vehicles and space experiments. The classes are briefly described below.
Class A - High Priority, Minimum Risk
Class B - Risk with Cost Compromises
Class C - Economically Re-flyable or Repeatable
Class D - Minimum Acquisition Cost
HDBK-343, originally published in 1986, was reviewed and found to be still valid in 1992. We can't due ...
Original Post Date: Thursday, February 17, 2011 My June blog entry suggested the use of parametrics in real-options valuation. This month, I’d like to offer the generalized use of our type of modeling in valuing tangible assets. Typically, fundamental analysis evaluates the intrinsic value of securities. I won’t attempt to compete with Warren Buffett here. But it is certainly the case that the value of a company, or of a portfolio of securities reflecting many companies, is based in part on the market value of its product assets and their potential for future earnings, as well as other objective and subjective considerations. In parametric estimation, we take a top-down ...
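The "potential for future earnings" side of the valuation argument can be reduced to a toy calculation: project earnings from a few top-level parameters, then discount them to today. This is a hedged illustration only, not the PRICE methodology; `market_size`, `unit_margin`, and the 10% discount rate are invented for the example:

```python
# A minimal sketch of parameter-driven, top-down valuation of a
# product asset by discounting projected earnings. All names and
# numbers here are hypothetical illustrations, not PRICE model inputs.

def present_value(cash_flows, discount_rate):
    """Discount a stream of projected annual earnings to today."""
    return sum(cf / (1 + discount_rate) ** (t + 1)
               for t, cf in enumerate(cash_flows))

# Top-down parametric drivers: earnings scale from a couple of
# assumed market parameters rather than bottom-up line items.
market_size = 10_000   # hypothetical units sold per year
unit_margin = 12.0     # hypothetical earnings per unit
projected = [market_size * unit_margin] * 5   # five flat years, simplistically
asset_value = present_value(projected, 0.10)  # 10% assumed discount rate
```

The design point is the parametric one: the whole earnings stream is driven by two high-level inputs, so sensitivity to each assumption is easy to explore.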
Original Post Date: Monday, November 15, 2010 Last week I attended the 25th International Forum on COCOMO and Systems/Software Cost Modeling. I attended for several reasons. First, I was invited to participate on a panel whose topic was “25 Years of Software Estimation: Lessons Learned, Challenges and Opportunities”. Second, I have attended in the past, and while it’s generally a small group, as such conferences go, I always come away impressed that so many smart people end up in one room; this year was no different. But I digress; I really wanted to share ...
Original Post Date: Tuesday, November 2, 2010 After some recent meetings with clients, I am sensing some confusion about how to estimate software reuse. I think part of the problem is in the definition of reuse, so let's start with a definition and then address the estimating issue. Software reuse is defined as “the use of existing software, or software knowledge, to build new software.” This definition comes from Wikipedia. From a software cost estimating perspective, the above definition is part of the problem. The definition should read: "Use of existing software, with no changes, for operation in the new software program.” If the existing software is going to be changed, ...
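One common way to handle the "existing software that will be changed" case, consistent with the stricter definition above, is a COCOMO-style adaptation adjustment factor that converts modified code into new-code-equivalent SLOC. This is a hedged sketch of that well-known technique, not the estimating approach from this post; the 10,000-line example and its DM/CM/IM percentages are invented:

```python
# COCOMO-style adaptation adjustment: convert adapted (modified)
# code into equivalent new SLOC. DM, CM, IM are the percentages of
# design, code, and integration effort that must be redone.
# Illustrative sketch only, not the author's TruePlanning method.

def equivalent_sloc(adapted_sloc, dm, cm, im):
    """Return new-code-equivalent SLOC for adapted code.

    dm, cm, im are percentages (0-100). Code reused verbatim
    (dm == cm == im == 0) contributes 0 equivalent SLOC here,
    matching the post's stricter definition of 'reuse'.
    """
    aaf = 0.4 * dm + 0.3 * cm + 0.3 * im  # adaptation adjustment factor
    return adapted_sloc * aaf / 100.0

# Hypothetical example: 10,000 lines taken from a prior program where
# 10% of the design, 20% of the code, and 30% of the integration
# must be redone -> 1,900 equivalent new SLOC.
esloc = equivalent_sloc(10_000, dm=10, cm=20, im=30)
```

Under this scheme, true reuse (nothing changed) adds no equivalent SLOC, while anything modified is costed in proportion to how much rework it drives.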