


Blog



Here is a taste of the NASA Symposium presentation...   Wikipedia offers a nice overview of the history of software engineering describing the stops and starts that have led us to where we are today.  [1] In the beginning, the complexity of the applications being developed was almost overshadowed by the logistics of the actual implementation process.  People didn’t have personal computers and feedback was not instantaneous – and it was impossible to make predictions about when a project would be complete.  As technology and tools improved, software was used to solve increasingly complex problems so the logistics deferred to the ...
In March 2018, NASA announced a ninth round of candidates participating in the CubeSat Launch Initiative.  As of March 2, the organization has selected 158 CubeSats from 39 states, and launched 59 missions as part of the Educational Launch of Nanosatellites (ELaNa) Program.  CubeSats are everywhere, and obviously, we want to know how much they will cost to design and build.  We expect that the projects will save money overall, by getting small missions up in space quickly, with shorter mission life, and shorter development schedules.  Since many of the teams participating in these early CubeSat experiments are universities, costs and effort are not always tracked in a way that ...
Background During cost estimation training events and consulting efforts, the author is often asked about the similarities and differences between two common risk methodologies: Method of Moments and Monte Carlo. These methodologies are used in a number of commercial software tools, such as TruePlanning®, Crystal Ball®, and @Risk®. Cursory observations show that the results are similar. However, a full study of the similarities and differences has not been done with regard to the behavior of these two risk methodologies within a commercial parametric estimating framework such as TruePlanning®. Independent of the method used, one of the most abstract tasks facing the ...
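For readers who want a feel for how the two methodologies differ mechanically, here is a minimal sketch that rolls up three independent, triangular-distributed cost elements both ways: by Monte Carlo sampling and by adding moments with a normal approximation. The distributions, the assumption of independence, and the dollar values are illustrative only; this is not how TruePlanning®, Crystal Ball®, or @Risk® implement their risk engines.

```python
# Minimal illustration (not any vendor's implementation): rolling up three
# independent, triangular-distributed cost elements two different ways.
import numpy as np

rng = np.random.default_rng(42)
elements = [(8, 10, 15), (20, 25, 40), (5, 6, 9)]  # (low, mode, high) in $M

# Monte Carlo: sample each element and sum the draws.
samples = sum(rng.triangular(lo, mode, hi, size=100_000)
              for lo, mode, hi in elements)

# Method of Moments: add means and variances, then assume a normal total.
mean = sum((lo + mode + hi) / 3 for lo, mode, hi in elements)
var = sum((lo**2 + mode**2 + hi**2 - lo*mode - lo*hi - mode*hi) / 18
          for lo, mode, hi in elements)

print(f"Monte Carlo  mean={samples.mean():.2f}  80th pct={np.percentile(samples, 80):.2f}")
print(f"Moments      mean={mean:.2f}  80th pct={mean + 0.8416 * var**0.5:.2f}")
```

With only a few elements the two totals usually land close together, which is consistent with the "cursory observations" noted above; studying where they diverge inside a full parametric framework is the subject of the paper.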
PRICE Cost Analytics – TruePlanning is the perfect way to do it! Model Based Engineering (MBE) environments are naturally rich with engineering data, including Computer Aided Design (CAD) drawings with Size, Weight and Power (SWAP) budgets. This data is perfect to drive parametric cost models like TruePlanning. All you need to do is create a ‘glue code’ interface to get that data out of where it currently exists and into TruePlanning. Fortunately, COTS integration environments like Phoenix ModelCenter make this easy to do: a TruePlanning ‘Plug-in’ has already been developed to provide that interface. MBE in a ModelCenter-like environment ...
This year at the International Cost Estimating and Analysis Association (ICEAA) Conference I will be talking about a journey towards a more perfect software sustainment estimate.  Many a path has been traveled on this journey but alas there are more miles to go.  My original intent in submitting this paper was to present interesting and groundbreaking results that would provide invaluable guidance to the cost community for their software sustainment efforts.  And I firmly believe that at the end of the day the project which I discuss will deliver exactly that but we’re not quite there yet.  Those of you who collect and ...
PRICE Systems has received several requests for the ability to produce a report that iterates over the PBS elements of a TruePlanning project. One common use for such a report would be a Basis of Estimate report. Another common use would be as a ‘brief’ of the estimate. The core concept is that a report could be generated that would contain information for each PBS element in the project and have a discrete report created for each element. The exact target for each discrete report could be variable. Users might want to have a report for each Activity for ...
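To make the core concept concrete, here is a rough sketch of the loop such a report generator would run: visit every PBS element once and write one discrete output file per element. The `project.pbs_elements` collection, the field names, and the demo object are hypothetical placeholders, not the actual TruePlanning API.

```python
# Conceptual sketch only: the loop structure behind a per-element report.
# "pbs_elements" and the field names are placeholders, not the real API.
from pathlib import Path
from types import SimpleNamespace

def write_element_reports(project, out_dir="reports"):
    Path(out_dir).mkdir(exist_ok=True)
    for element in project.pbs_elements:          # one pass over the PBS
        lines = [f"Basis of Estimate: {element.name}",
                 f"Cost object type: {element.cost_object_type}",
                 "Key inputs:"]
        lines += [f"  {name} = {value}" for name, value in element.inputs.items()]
        lines.append(f"Estimated cost: {element.total_cost:,.0f}")
        # One discrete report file per PBS element.
        Path(out_dir, f"{element.name}.txt").write_text("\n".join(lines))

# Tiny stand-in project so the sketch runs end to end.
demo = SimpleNamespace(pbs_elements=[SimpleNamespace(
    name="Avionics Software", cost_object_type="Software Component",
    inputs={"Functional Size": 5000}, total_cost=1_250_000)])
write_element_reports(demo)
```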
On April 17th, 2018 I will be presenting a webinar through the IT Metrics & Productivity Institute (ITMPI) where I will discuss the Internet of Things, the Network of Things, and enabling technologies, along with examples of IoT/NoT implementations that are changing the way people live and work. The live webinar is free and registration is found at this link. In 1926, Nikola Tesla said, “When wireless is perfectly applied the whole earth will be converted into a huge brain, which in fact it is, all things being particles of a real and rhythmic whole…. And the instruments through which ...
Will we use giant robots in the military? The likelihood of the US military making use of well-functioning giant robots is higher than you’d think. In fact, the military has successfully used several different types of robots for years. Drones like the MQ-9 Reaper supplement our air fleets and hunt down targets. Robots like the QinetiQ TALON aid in surveillance, bomb defusing and item retrieval. There is so much interest in improving robotics that the US government has invested heavily in robotics companies like Boston Dynamics to build military-grade robots that can actually walk. So how close are we ...
The age of giant, fighting robots is here. At least, that was the premise for the Giant Robot Duel between MegaBots, Inc. (Team USA) and Suidobashi Heavy Industry (Team Japan), which took place in Japan and started streaming online in October 2017. The long-awaited contest ended in a display reminiscent of BattleBots with fireworks, sparks and chainsaws galore. Unfortunately, the duel’s reception was mixed, with many viewers left disappointed and even bored due to the robots’ limitations. Though appearing humanoid, MegaBots’ Eagle Prime and Suidobashi’s Kuratas both required tracks or wheels to maneuver and didn’t show the agility or ...
North Korea’s newest nuclear missile, the Hwasong-15, is now able to reach the continental United States. The global community has condemned missile testing, and President Trump has stated that North Korea will be met with “fire and fury” should the country threaten the US. And the situation only continues to escalate as Trump and Kim Jong Un trade barbs over the size of their nuclear launch buttons. As expected, many of our friends and family members (along with everyone else) have started to worry about the prospect of nuclear war. But we can ease our worries a little if ...
Forrester Research predicts that by 2022, more than 66 million households in the US – about 50% of all homes – will have a smart speaker like Amazon Echo or Google Home.1 Our TVs are only going to get smarter, too; through integration with smart speakers, consumers will have complete hands-free control, navigating their televisions simply by speaking. But consumer technology, such as home automation and personal electronics, isn’t the only big player in the tech industry. 2018 will bring new advances, applications and discoveries, so let’s review some of 2017’s most trending technologies and how they will evolve in ...
If you’re in charge of cost estimating for software development projects, there are several potential oversights you could make that may lead to faulty estimates and unwanted cost surprises. Let’s review a few common project process misconceptions and the added costs that you might be missing in your estimates:
1. Forgetting to incorporate infrastructure costs
Successful software development projects require more than just a few desktops and a network connection. Crucial infrastructure elements that need to be set up for the development team include configuration of the source code repository, artifact management servers, test environments for manual and automated test activities ...
Simply Estimating Air Force One Do you remember a story from late 2016 that dealt with cost estimation, Air Force One and then President-Elect Donald Trump? At the time it appeared in Americans’ Twitter feeds, the tweet below caused a lot more confusion than it should have. Situations like this always pique the interest of PRICE staff because we understand how cost estimation affects the success of projects like the new Air Force One and why it’s important to use accurate and robust estimating software. Let’s talk about the repercussions of this controversial message and clear up misconceptions you might ...
Check out this article – “New software can detect when people text and drive” – it turns out that researchers at the University of Waterloo have created algorithms that recognize distracted driver behaviors such as texting, cell phone usage or reaching into the backseat to retrieve items.  The system uses cameras and artificial intelligence to detect and classify distracted behaviors, with machine learning employed to ‘inform’ the algorithms that are at the heart of this technology.  The algorithms build on work that has been done at the university’s Centre for Pattern Analysis and Machine Intelligence creating intelligence to recognize blinking eyes, pupil dilation, ...
Check out this presentation by Dr. Ken Nidiffer of the Software Engineering Institute (SEI) at Carnegie Mellon University – presented at the STC 2017 Conference at NIST.  According to NDAA 2013, Section 933  “Software assurance provides the required level of confidence that software functions as intended (and no more) and is free of vulnerabilities, either intentionally or unintentionally designed or inserted in software throughout the lifecycle.” It was clear from this and several other presentations at the conference that the way to achieve software assurance is to integrate it thoroughly into the system acquisition lifecycle.  Nidiffer detailed some of the ...
I just recently attended the 28th Annual IEEE Software Technology Conference (STC) sponsored by IEEE and hosted at the National Institute of Standards and Technology (NIST).  The conference provided attendees with incredible quality content – 8 wonderful keynote sessions and 51 great presentations (OK – I didn’t attend all of them obviously, but the ones I did were insightful, useful and informative).  STC was founded in 1989 by the Assistant Secretary of the Air Force for Acquisition (Communications, Computers, and Support Systems), Mr. Lloyd Mosemann, through the Software Technology Support Center (STSC) at Hill Air Force Base.  The purpose of the ...
With the latest update for TruePlanning 16.0, we have a new plugin for Phoenix Integration’s ModelCenter tool.  ModelCenter aids in the design and optimization of systems, and is a popular tool with many TruePlanning users in the aerospace and defense industry.  The new plugin will allow users to integrate TruePlanning’s cost models with computer-aided design (CAD) tools, simulation tools, performance models, Excel, etc. and automate data sharing between them. By linking these models/tools together, you can do in-depth studies on the entire trade space for a design, and find optimal answers to very complex design questions.  And with the ...
Several weeks back I attended the Practical Software Measurement (PSM) Users Group in Crystal City, Virginia.  This is a small but good conference that combines presentations on many aspects of measurement for software and systems with workshops in the afternoon where government, industry and academia work together to address issues of import to system and software measurement.  As you might imagine, there were several presentations focused specifically on cybersecurity - a topic that is becoming more and more of an issue in our industry. All were quite good but one particularly enlightening presentation was presented by Joe Jarzombek of Synopsys ...
Driverless cars – brilliant idea or completely frightening?  But if you look at the Google Car – which has logged over 700,000 road miles with only two accidents, both of which involved missteps of a human driver – maybe it’s not such a bad idea.  Joshua Schank of the Eno Center for Transportation is quoted here as saying “People are not great at driving – 30,000 people die in car accidents each year (in the United States).  Machines can be much better than humans when it comes to driving; they don’t drink or text and can think faster”.  ...
The Capability Maturity Model Integration – Development (CMMI-Dev) is a model designed to help organizations achieve and institutionalize process maturity. CMMI specifies the goals that need to be achieved to reach a specific maturity or capability level – it is neither rigid nor prescriptive with how exactly these goals are to be met.   Agile is a philosophy and set of tenets  for software projects that are characterized by highly collaborative, cross-functional teams who work closely with their customers to deliver regular increments of functional software capability that the customers and end users are happy with.  Neither the agile philosophy ...
It is common for consumers of TruePlanning Cost Analysis to request some type of uncertainty analysis, and this is frequently a request for an 80% confidence level report. TruePlanning can produce an uncertainty report and provide confidence levels, but some organizations have a prescribed method for generating uncertainty analysis based on the use of Monte Carlo. Many of these organizations use @Risk to perform Monte Carlo analysis. TruePlanning has an integration with @Risk so TruePlanning users can develop an 80% confidence level based on inputs in TruePlanning projects. Recently a customer contacted PRICE and indicated that ...
TruePlanningXL is the new Excel based interface to TruePlanning. TruePlanningXL gives users the ability to work with TruePlanning projects in Excel. This includes allowing users to update inputs, calculate, calibrate and even create new projects. Much of the work performed in Excel on TruePlanning projects involves setting input values. The majority of inputs are single values and are available on the PBS sheet provided by TruePlanningXL. There are, however, some inputs that do not fit well on the PBS sheet: temporal inputs. Temporal inputs are inputs that contain multiple values to reflect values over time. In the TruePlanning application, ...
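As a simple illustration of that distinction, the structure below contrasts a single-value input with a temporal input that carries one value per year. The input names, years, and values are invented, and this generic representation is not TruePlanningXL's actual sheet layout.

```python
# Illustration only: a temporal input carries one value per time period,
# unlike the single-value inputs on the PBS sheet. The names below are
# hypothetical, not TruePlanningXL's real layout.
single_input = {"input": "Weight", "value": 125.0}

temporal_input = {
    "input": "Maintenance Staff Level",          # hypothetical input name
    "values": {2024: 4.0, 2025: 6.5, 2026: 6.5, 2027: 3.0},
}

# Walk the per-year values in order, as a sheet row would lay them out.
for year, value in sorted(temporal_input["values"].items()):
    print(year, value)
```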
The 2017 CRASH Report is now available.   This is a report of CAST Research on Application Software Health.  The results are based on a study of the structural quality of 1850 applications totaling more than one billion lines of code from around the world, based on five health factors that measure:
Robustness – measuring the likelihood of outages and time to repair based on poor implementation practices
Security – measuring violations of secure coding practices which can lead to security breaches and data theft
Performance Efficiency – measuring potential performance ...
Field Programmable Gate Arrays (FPGA) are integrated circuits designed to be configured by a designer after manufacturing.  In recent years, FPGA usage has been increasing at a rapid pace, as their capability (speed, energy efficiency, amount of logic that can fit on the chip, etc.) has come to rival ASICs.  As both the number and size of FPGA projects has increased, improving methods of cost estimation of these projects is becoming more critical for project success. FPGA development combines aspects of both hardware development and software development.  These projects begin with architectural design and writing code in a Hardware Description ...
Check out this article on “The History and Purpose of the Capability Maturity Model (CMM)” (https://toughnickel.com/business/The-History-and-Purpose-of-the-Capability-Maturity-Model-CMM) It provides an interesting and thought-provoking accounting of how Carnegie Mellon University’s (CMU’s) Software Engineering Institute (SEI) came to be and how the quest of NASA and the US Air Force led the charge to improve software quality.  According to the article – “The Capability Maturity Model was developed to ensure success when success really matters – at NASA and in the military where lives are on the line and success is survival”.  The problem the industry had with this quest ...
In my previous blog, I discussed the calibration feature newly available in the TP2016.0 (16.0.6187.2) API. Now reliable and repeatable calibrations can be set up through the TruePlanning API easily and quickly. For most users the standard calibration usage will be all they need. That said, there are some users who would like to use a feature that is available in the TruePlanning Calibration GUI: the ability to calibrate multiple inputs in one calibration.  The most common reason to use a constant in a calibration is to find a single value, such as Organizational Productivity, across multiple Cost Objects that yields ...
Of the many features that help TruePlanning stand out as the premier predictive cost analytics tool, TruePlanning’s ability to calibrate is frequently cited by users as the most important feature in TruePlanning. Users can create models that are highly tuned to the organizations and processes that they are modelling. The 50,000-foot description of calibration is to increase the fidelity of a model to a known process or organization by using ‘actuals’ (known result data) and driving TruePlanning inputs so that the known values are obtained. A concrete example would be knowing the number of hours it took to create a ...
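Here is a toy sketch of that calibration idea: search on one driver until a made-up, one-parameter cost model reproduces a known actual. It illustrates the concept only; the model and the bisection search are illustrative assumptions, not TruePlanning's internal math or calibration algorithm.

```python
# Toy illustration of calibration: adjust one driver until the model
# reproduces a known actual. The model below is made up for illustration.
def model_hours(size, productivity):
    return size / productivity          # hypothetical one-parameter model

def calibrate_productivity(size, actual_hours, lo=0.01, hi=100.0):
    for _ in range(60):                 # bisection on the productivity input
        mid = (lo + hi) / 2
        if model_hours(size, mid) > actual_hours:
            lo = mid                    # too many hours: raise productivity
        else:
            hi = mid
    return (lo + hi) / 2

# Example: 12,000 units of work took 800 hours on a past project.
print(calibrate_productivity(12_000, 800))   # ~15 units per hour
```

Once the calibrated value is found, the same input can be carried forward into estimates of new, similar work, which is where the increased fidelity pays off.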
Check out the November/December issue of Crosstalk (http://www.crosstalkonline.org/issues/novdec-2016.html) “Beyond the Agile Manifesto.”  Here you will find several really great articles on the uses and the future of agile development.  As usual I started at the end with David Cook’s Backtalk article – “Too Agile for my Own Good” (http://static1.1.sqspcdn.com/static/f/702523/27309805/1477697823283/201611-Cook.pdf?token=ayD00sE4rcknGJpn6XEk1O08YBk%3D).  Cook shared whimsical information about his favorite grocery items along with his curmudgeon-like frustration with the store’s insistence on changing things up occasionally by moving familiar items to unfamiliar locations throughout the store.  Cook admits, however, that this behavior – while originally annoying – has resulted in improvements in the ...
Crowdsourcing is the practice of harnessing the power of the crowd to solve problems or accomplish certain tasks.  The expression Crowdsourcing was coined as a portmanteau of the words crowd and outsourcing.  While advances in technology have pushed this practice to the forefront recently – the notion has been around for a long time. It is a participative online activity where questions and tasks are proposed by an individual or organization via a particular crowdsourcing platform.  Individuals or groups of individuals who belong to that community – accept the challenge and attempt to answer the question or complete the proposed ...
One of the most frequently asked questions from new users is:  At what level should I be estimating my project in TruePlanning®?  To be clear, the question involves how detailed the product breakdown structure should be.  The answers can vary widely, as seen in the two examples below for an aircraft.
Figure 1.  F-22 template available in TruePlanning® 2016 (Note: subassemblies collapsed to fit the PBS onto one page.)
Figure 2.  F-22 estimated as one hardware cost object.
The preceding examples might be extreme ends of a spectrum.  In the first example, we have provided a template in TruePlanning® 2016 that models a fighter ...
We have all likely heard of “should cost” estimates.  Boiled down: If everything goes as planned, how much should the program cost?  Are there efficiencies or best practices we can employ to get the job done faster and cheaper?  We may use analogies, parametrics, bottom-up estimates, or extrapolation of actual costs to make a determination.  How many of us have heard of “should price”?  “Should price” is the determination of reasonableness of price for commercial items.  When contracting personnel receive quotes for off-the-shelf items, they need tools at their disposal to determine if the item is fairly priced.  On ...
Recently, I was contacted by an Air Force estimator with a novel challenge.  The estimator’s product breakdown structure modeled the development and production of various hardware items over several years.  TruePlanning can easily assist in spreading production quantities or tracking “ship sets” of items required for a weapon system.  In this case, the user wished to have different quantities for different items.  Typically, we use a System cost object or the System Folder to spread production, which is then attributed to all components within the estimate.  Coming up with multiple production schedules within the same estimate presented a challenge. In ...
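For illustration only, the sketch below shows the bookkeeping behind per-item production schedules: each item carries its own quantity-by-year profile, and the yearly totals fall out of a simple roll-up. The item names and quantities are invented, and this is not the TruePlanning mechanism discussed in the blog.

```python
# Sketch of the bookkeeping only: different quantities for different items
# across the same production years, with a roll-up of totals by year.
from collections import defaultdict

schedules = {
    "Radar Assembly":   {2024: 2, 2025: 6, 2026: 6},
    "Airframe":         {2024: 1, 2025: 4, 2026: 8, 2027: 8},
    "Mission Computer": {2025: 10, 2026: 10},
}

totals_by_year = defaultdict(int)
for item, per_year in schedules.items():
    for year, qty in per_year.items():
        totals_by_year[year] += qty

for year in sorted(totals_by_year):
    print(year, totals_by_year[year])
```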
In the previous (second) blog in this series, we discussed using the NIST Special Publication 800-171 Appendix E to list all possible cyber security requirements.  We then down selected the entire list of 123 items into roughly 60 that may directly impact the software development process.  Now, we will cover how the impact of those 60 items could possibly be included in a TruePlanning® estimate. I will offer three primary methods for accounting for additional effort of cyber security requirements.  We will look at modeling the requirements as individual cost objects in the estimate.  We will then consider setting inputs ...
We will pick up where we left off on estimating the cost of cyber security by looking at requirements.  Recall from a previous blog that the requirements for Cyber Security are outlined in Appendix E of the National Institute of Standards and Technology (NIST) Special Publication 800-171 document titled “Protecting Controlled Unclassified Information in Nonfederal Information Systems and Organizations.”  In Appendix E, there are a series of tables that outline the requirement, as well as the responsible authority for ensuring those requirements are met.  There are four categories of requirements*: NCO: Not directly related to protecting ...
A recent article in the National Defense Magazine highlighted the ever increasing need for cyber security.  (See http://www.nationaldefensemagazine.org/articles/2016/12/12/pentagon-paying-more-to-be-hacked)  When working on a software estimate for a program office here at Wright-Patterson AFB, I was asked “how do you handle cyber security requirements?”  My response was, “What does that mean for your program?  How are the requirements different?”  There was no good answer.  We may be required to incorporate cyber security requirements into a new software project, but there is no really good guidance as to what that exactly means.  We can probably assume that costs are higher for a ...
We’ve added many cool new features to our Hardware estimating model in TruePlanning® 2016, and I’d like to introduce the new Learning Curve features in this blog. First, we’ve added the ability to pull out the theoretical cost of any individual production unit, anywhere along the learning curve.  We call it the “Nth Unit Cost” metric.  This has been requested by many users, notably Air Force users and aircraft manufacturers, where it is common to use the 100th unit cost as a reference point to compare different types of aircraft.  Simply enter a value for “N”, and the Metrics tab ...
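For readers who want the math behind an “Nth Unit Cost” number, below is the standard unit-theory learning curve that such a metric is typically based on, sketched in Python. The 90% slope and $2.0M first-unit cost are example values, and nothing here should be read as the exact equations inside TruePlanning®.

```python
# Standard unit-theory learning curve: the usual math behind an
# "Nth unit cost" metric (illustrative, not TruePlanning's internals).
import math

def nth_unit_cost(first_unit_cost, learning_rate, n):
    """Cost of unit n given a first-unit cost and a learning-curve slope
    (e.g. 0.90 means each doubling of quantity cuts unit cost to 90%)."""
    b = math.log(learning_rate, 2)      # exponent derived from the slope
    return first_unit_cost * n ** b

# Example: $2.0M first unit on a 90% curve -> theoretical 100th unit cost.
print(f"${nth_unit_cost(2.0e6, 0.90, 100):,.0f}")
```

This is why the 100th unit is a handy reference point: it sits far enough down the curve that first-article noise has washed out, so different aircraft can be compared on a common footing.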
TrueMapper strives to provide an environment that allows users to be as efficient as possible when creating mappings. Sometimes users need to map all of a specific type of Cost Object, Activity or Resource to specific WBS items. In this case users would use the “Generic” tab of the Mapping Assignment Window. Assignments made from this tab can be stored off in a *.tpgen file and used against any PBS structure. Recently, some TruePlanning users came across a situation where they needed to assign all Resources from specific Cost Objects to a specific WBS element, except one. Those familiar with ...
TruePlanningXL is the TruePlanning/Excel interface reboot. Excel is a popular topic in the PRICE universe because Excel is clearly the most popular tool in the cost analytics quiver. Customers such as the Air Force, Army and the majority of the primary suppliers have processes inextricably based on Excel. One of the primary design goals of TruePlanningXL is for it to be as easy to use as possible. To support this, great effort is being made to study how it is used and improve TruePlanningXL’s usability based on the research. A second primary design goal is for TruePlanningXL to be ...
In our previous blog, we defined PCA and PRICE’s unique approach via TruePlanning.  This blog is the seventh of seven describing specific impacts on key business processes and functional areas that benefit from this capability. Affordability is an abstract term that most people think they understand but have difficulty articulating. The United States Department of Defense defines affordability as the degree to which the life-cycle cost of an acquisition program is in consonance with the long-range investment and force structure plans of the Department of Defense or individual DoD Components. However, the Department of Defense acquisition regulations require program ...
In our previous blog, we defined PCA and PRICE’s unique approach via TruePlanning.  This blog is the sixth of seven describing specific impacts on key business processes and functional areas that benefit from this capability. Estimates based more on engineering judgment than statistical data lack credibility. This is especially true when proposing design alternatives to meet a customer’s unique program requirements. Companies that can’t demonstrate a rigorous affordability management process score low with their proposals because bids presented without statistical data to back the estimates appear over-optimistic and risky, or too expensive. This situation ultimately has a ...
In our previous blog, we defined PCA and PRICE’s unique approach via TruePlanning.  This blog is the fifth of seven describing specific impacts on key business processes and functional areas that benefit from this capability. So what are my competitors most likely to bid? Competition Ghosting (sometimes called competitive analysis) can be an important capability when preparing for opportunity qualification or bid & proposal development. Estimating what you believe your competition might do is extremely useful intelligence for your own Price-to-Win process. Ghosting empowers the user with an objective assessment of the competitor’s offering.  This is done for two ...
In our previous blog, we defined PCA and PRICE’s unique approach via TruePlanning.  This blog is the fourth of seven describing specific impacts on key business processes and functional areas that benefit from this capability. Effective management of a system or program starts with a budget in the front end and execution control and evaluation in the back end. Effective cost control can be achieved if the organization has a dynamic and flexible cost estimation capability which sets a baseline in the budgeting process and updates the costs / estimate-to-complete quickly and with minimum effort. Effective cost control is ...
In our previous blog, we defined PCA and PRICE’s unique approach via TruePlanning.  This blog is the third of seven describing specific impacts on key business processes and functional areas that benefit from this capability. Budgeting is a critical first step in developing an effective cost management plan. The ability to integrate your specific budget tools with a predictive cost estimation engine, whose output can be tailored to your organization’s cost activity center structure, is a very powerful capability. The requirements to achieve this level of budget management consist of the following: Accessing historical budget ...
In our previous blog, we defined PCA and PRICE’s unique approach via TruePlanning.  This blog is the second of seven describing specific impacts on key business processes and functional areas that benefit from this capability. For estimators and engineers, the PRICE Cost Analytic Framework® improves cost estimating speed and accuracy, with built-in flexibility to evaluate multiple alternatives quickly. The integrated set of data, tools, and processes enables organizations to successfully estimate and analyze the effort, schedule, and cost of projects.  Specific estimation applications include Independent Cost Estimation (ICE), should-costing, vendor/supplier assessments, cost-benefit trade studies, proposal baselining… and more. PRICE® ...
In our previous blog, we defined PCA and PRICE’s unique approach via TruePlanning.  This blog is the first of seven describing specific impacts on key business processes and functional areas that benefit from this capability.   PRICE® Cost Analytics (“PCA”) provides the tools and a systematic approach to enhance the effectiveness of Total Cost Management (“TCM”) through the application of cost management principles, proven methodologies and the latest predictive technology in support of the management process and its decision-making. PCA and TCM both start with a simple concept based on the time-tested Deming or Shewhart cycle. In essence, TCM and ...
  On this website, you will find multiple blogs as well as other links to descriptions of v2016’s new features and capabilities, often in the context of Predictive Cost Analytics and specifically our process-instance of PRICE® Cost Analytics or “PCA”… But what exactly is that, you might ask? PCA is a proven predictive analytics approach focused on cost management (Cost Estimation, Cost Budgeting, and Cost Control) that delivers the answers you need, when you need them.  As cost, profitability and affordability are a critical part of your decision criteria, PRICE® Cost Analytics will make you more confident in your decisions. PRICE® Cost ...
Your license has been updated to include TruePlanning 2016’s new IT Services catalog.  This capability is a significant upgrade to v14.2’s IT Infrastructure catalog.  In the past, the latter, which reflected typical logistical equipment build-up philosophies, focused estimates primarily on operations of existing/deployed IT enterprises.  Now, with IT Services, you can estimate IT deliverables as services, delivered with in-house and/or contracted resources.  This catalog was specifically designed to better align with the way organizations currently estimate and budget for IT services.  There are now 8x2 = 16 specific types of estimating capabilities, supporting both New Projects and ...
I hope you have enjoyed my series of blogs on the Four Pillars of Affordability. I have tried to impart to you what I have learned during my 30 years in the Defense Industry and the last 25 years as an Affordability Manager and cost estimator. The purpose of the Four Pillars of Affordability is to enable you, and your program, to develop a product for your customer that is considered affordable and ultimately, a value to them. My definition of Affordability is: Affordability is the process that balances performance with price to meet customer needs. A product is ...
When first creating your resume it can be overwhelming to know where to start. Personally, I would start by opening a Word document and looking through the pre-created templates. These templates help guide you to the best place to insert your information and provide an overall layout of a good resume. The same thing happens when you open TruePlanning® to create your first cost estimate. Like the pre-created templates in Microsoft Word, our templates were created by our cost research team to provide you with a guide on how to get started on your estimate. One of the templates we ...
In my first blog of this series I introduced you to the Four Pillars of the Affordability Process. They are:
1. Management Support
2. Methodology
3. Training
4. Tools/Automation
In this fifth blog I write about the Fourth Pillar, Tools/Automation, teaching you what I have learned over the past 25 years as an Affordability Manager and Cost Estimator. Now that Management Support is in place, a written Methodology has been published giving Affordability the legitimacy it needs, responsibilities have been delineated, templates are in place to follow, and both Management and Design IPT Members have been trained, it is time to focus on the tools, automation, and support personnel ...