Blog



Check out this article on “The History and Purpose of the Capability Maturity Model (CMM)” (https://toughnickel.com/business/The-History-and-Purpose-of-the-Capability-Maturity-Model-CMM). It provides an interesting and thought-provoking account of how Carnegie Mellon University’s (CMU’s) Software Engineering Institute (SEI) came to be and how NASA and the US Air Force led the charge to improve software quality.  According to the article, “The Capability Maturity Model was developed to ensure success when success really matters – at NASA and in the military where lives are on the line and success is survival”.  The problem the industry had with this quest ...
Original Post Date: Wednesday, June 23, 2010 Parametric modeling is excellent for all aspects of early-concept cost estimation, including go/no-go decisions downstream. So, in the spirit of bringing transparency to (ethical) financial engineering… why not apply our craft to pricing “real-options”? The latter are essentially strategic opportunities for engaging resources (cost/schedule) into projects, ventures, investments, or even abandonments. The opportunity choice has value itself!  Unlike static project Net Present Value (often, but not exclusively, approximated with Discounted Cash Flow), which assumes pre-defined decisions, real-options reflect the merit of flexibility. If an R&D or proof-of-concept presents viability/marketability learning, the option has positive value, above ...
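For illustration, a minimal sketch of the contrast described above: a static NPV commitment versus a simple two-outcome option to proceed or abandon after a proof of concept. The figures and function names are hypothetical, not taken from any PRICE model or from the original post.

```python
# Illustrative only: static NPV of committing now vs. the expected value when
# we can wait for proof-of-concept learning and abandon a bad outcome.
def static_npv(annual_cash_flow, investment, discount_rate, years):
    """NPV of committing now to a fixed cash-flow stream."""
    pv = sum(annual_cash_flow / (1 + discount_rate) ** t for t in range(1, years + 1))
    return pv - investment

def option_value(npv_if_viable, npv_if_not, p_viable):
    """Expected value with flexibility: the losing branch is abandoned (floored at zero)."""
    return p_viable * max(npv_if_viable, 0.0) + (1 - p_viable) * max(npv_if_not, 0.0)

commit_now = static_npv(annual_cash_flow=30.0, investment=120.0, discount_rate=0.10, years=5)
with_option = option_value(npv_if_viable=40.0, npv_if_not=-25.0, p_viable=0.5)

print(f"Static NPV (commit today):      {commit_now:6.1f}")   # negative in this example
print(f"Expected value with the option: {with_option:6.1f}")  # positive: flexibility has value
```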
Original Post Date: Thursday, October 7, 2010 Ahhhh, the 80s… a challenging (but often confusing) time in an evolving computing world.  Working in 1985 as a software estimator as well as an SQA engineer in a quality assurance department that “audited” real-time projects using new concepts like OOD & OOP… well, you get the picture.  It was a great time to get immersed in great work.  And the good news:  that company’s process as well as its developers were bullish on a young estimation/quality type asking plenty of questions… as long as they were of the Yes-No variety.  And ...
Original Post Date: Monday, December 6, 2010  In his August blog entry here, Zach Jasnoff outlined typical client perspectives for the different types of analyses that TruePlanning can accommodate. Working on a large project, we’ve experienced realistic situations where the initial intent and model structuring later have the boundaries of model appropriateness stretched. An Analysis of Alternatives (AoA), for example, is meant to measure deltas between a baseline and its alternatives. If common costs “wash” then they can be excluded… which becomes an issue when the analysis is treated as a Rough Order of Magnitude for customer budgeting.  Likewise, if a ROM or Independent Cost Estimate ...
Original Post Date: Thursday, August 12, 2010 The late Philip Crosby’s “Quality is Free” taught us that an investment in quality is more than offset by the prevention of defects, based upon an understanding of requirements. Only with the latter can lack of conformance (and its subsequent costs) be captured and hence quality quantified. So how then is Parametrics relevant?  Parametric estimating is more than cost modeling. Our craft represents an initial consulting function into the accuracy and completeness of program planning concepts. Our customers trust us to know when to ask and when to supplement. Yes, we are mathematical and financial modelers too. But I’d suggest that “Parametrics is ...
Original Post Date: Wednesday, August 25, 2010 To me the greatest strength of TruePlanning® is its flexibility and wide range of parameters. AND, to me the greatest weakness of TruePlanning® is its flexibility and wide range of parameters. The nature of the TruePlanning® framework, the idea of cost catalogs with cost objects, and the implementation of activity-based costing provide for a wide range of solutions to our cost estimating requirements. TruePlanning®’s ability to address a wide range of cost problems lies in its flexibility and extensibility. The clever use of worksheets, multipliers, and the product breakdown ...
Introduction: The goal of this blog is to show how data can flow between TruePlanning and ProPricer. This walkthrough is based on estimating a software application that will provide users the ability to track objects orbiting the Earth using a feed from some fictitious data stream. The benefit is the ability to get the labor requirement (effort in hours) estimated by TruePlanning into ProPricer in a seamless, easily repeatable process.   1. Create ProPricer Proposal for the Orbiting Body Tracking application The first step is to create a proposal in ProPricer with a WBS. Each task in ProPricer will have a set of ...
Original Post Date: Monday, August 18, 2014 I had the distinct pleasure last week of attending the 2014 NASA Cost Symposium.  While to the uninitiated this might sound like a bit of a snoozer – it was actually quite interesting and proved to be the source of a ton of valuable information.  The event took place at Langley Research Center in Hampton, VA – near Williamsburg, Newport News, and not too far from Virginia Beach.  My participation was somewhat self-serving in that I was there to talk about PRICE’s new Space Missions Cost Models for TruePlanning®.  This model – discussed ...
Original Post Date: Friday, July 25, 2014 July 2014 marked the 45th anniversary of Neil Armstrong’s historic stroll on the moon.  If you go to the NASA website and select Missions you’ll probably be amazed at the number of missions in NASA's past, present, and future.  Unless you’re living under a rock, you know about the International Space Station and the Hubble telescope, but I’m guessing there’s a lot about space missions that many of us are unaware of.  The Dawn spacecraft, which was launched in 2007 from Cape Canaveral, was sent into space to help NASA scientists learn about the history ...
Original Post Date: Thursday, July 17, 2014 Introduction Parametric cost estimates provide high quality, defendable estimates early in a project’s life cycle. This makes them ideal when producing bids and proposals. The nature of parametric cost estimates, however, requires the results of the estimate to be framed in terms of specific CERs and Activities and Resources. It is common for an organization to have a more granular set of Resources than the ones used to support the CERs. One approach to resolving this issue would be to use the TrueMapper application from PRICE Systems to map TruePlanning Resources to a more ...
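As an illustrative sketch of the general mapping idea (not the actual TrueMapper implementation; the resource names and split ratios below are hypothetical):

```python
# Illustrative only: allocate coarse parametric resource hours to a more
# granular internal labor-category structure using assumed split ratios.
coarse_hours = {
    "Software Engineer": 4200.0,   # hours estimated by the parametric model
    "Systems Engineer": 1100.0,
}

split_rules = {
    "Software Engineer": {"SW Developer II": 0.60, "SW Developer III": 0.25, "SW Test Engineer": 0.15},
    "Systems Engineer": {"Systems Analyst": 0.70, "Requirements Engineer": 0.30},
}

granular_hours = {}
for resource, hours in coarse_hours.items():
    for category, share in split_rules[resource].items():
        granular_hours[category] = granular_hours.get(category, 0.0) + hours * share

for category, hours in sorted(granular_hours.items()):
    print(f"{category:25s} {hours:8.1f} h")
```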
Original Post Date: Thursday, July 10, 2014 Introduction TruePlanning provides a powerful and highly customizable reporting environment. Project data can be viewed from many different perspectives and those perspectives can be saved and reused across all projects. Results can be exported to Excel and Word as custom reports. There are, however, some instances where there is a need to get beyond the two axes used in TruePlanning’s reporting engine. This need is sometimes expressed by users preparing TruePlanning cost estimating project data for use in a bid or proposal. Perhaps the data needs to be split by phase and labor/non-labor over ...
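A minimal sketch of that kind of split, assuming a hypothetical flat export with one row per cost object and phase plus a labor/non-labor flag; the column names are illustrative, not TruePlanning’s report schema.

```python
# Illustrative only: add a "third axis" by pivoting phase and labor/non-labor
# across the columns of a flat cost export.
import pandas as pd

rows = [
    {"cost_object": "Flight SW",   "phase": "Development", "labor": "Labor",     "cost": 1_250_000},
    {"cost_object": "Flight SW",   "phase": "Development", "labor": "Non-Labor", "cost": 180_000},
    {"cost_object": "Flight SW",   "phase": "Production",  "labor": "Labor",     "cost": 340_000},
    {"cost_object": "Avionics HW", "phase": "Production",  "labor": "Non-Labor", "cost": 2_100_000},
]
df = pd.DataFrame(rows)

report = df.pivot_table(index="cost_object",
                        columns=["phase", "labor"],
                        values="cost",
                        aggfunc="sum",
                        fill_value=0)
print(report)
```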
Original Post Date: Tuesday, July 1, 2014 Whether you’re doing a software cost estimate to support a Bid and Proposal effort, a software valuation, a should-cost analysis, or a detailed project plan, it is vitally important to understand the ‘size’ of the software you are estimating.  The problem with software size is that it tends to fall into the intangible realm of reality.  If you tell me you are building a widget that weighs 13 pounds, I can really start to get my head around the task at hand.  If I’m chatting about this with my European colleagues, ...
Original Post Date: Friday, June 20, 2014 Proposal estimates based on grassroots engineering judgment are necessary to achieve company buy-in, but often are not convincing or not in sync with the price-to-win.  This contention can be resolved by comparing the grassroots estimate to an estimate developed using data-driven parametric techniques.  Parametric estimates apply statistical relationships to project data to determine likely costs for a project.  Of course, for a parametric model to properly support this cross-check of the grassroots estimate, the proper data must be fed into the model.  This most likely requires the estimator to reach ...
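To make the idea of a statistical relationship concrete, here is a minimal sketch of fitting a power-law cost estimating relationship to a handful of invented data points and using it as a cross-check; the weights, costs, and functional form are assumptions for illustration.

```python
# Illustrative only: fit cost = a * weight^b in log-log space, then predict.
import numpy as np

weights = np.array([120.0, 250.0, 400.0, 650.0, 900.0])   # lbs (invented)
costs   = np.array([1.8,   3.1,   4.6,   6.9,   9.0])     # $M (invented)

b, ln_a = np.polyfit(np.log(weights), np.log(costs), 1)   # slope, intercept
a = np.exp(ln_a)

new_weight = 500.0
parametric_cost = a * new_weight ** b
print(f"CER: cost = {a:.3f} * weight^{b:.2f}")
print(f"Parametric cross-check for a {new_weight:.0f} lb item: ${parametric_cost:.1f}M")
```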
Original Post Date: Wednesday, April 2, 2014 Introduction Parametric estimates provide reliable, reproducible, and flexible views into cost and effort, so it’s only natural to want to include this data in a bid and proposal workflow. With TruePlanning 2014, big steps have been taken to make such integration seamless and easily reproducible.  New tools in the TruePlanning suite of products, as well as integrations with some of the major bid and proposal software applications, are at the heart of this new feature set. You can learn more about TruePlanning 2014 and the PRICE cost estimation models at our website, but let's ...
Original Post Date: Thursday, March 20, 2014 Here’s a conundrum.  You are a software estimator responsible for helping the decision makers in your company determine what business to pursue and what business to steer clear of.  You know that, to win profitable business, your company first needs to decide which opportunities are golden and which should be avoided.  You also know that, at the point at which this decision needs to be made, there is very little information available to support a quality estimate.  Add to this the fact that software estimation is hard at almost any stage.  What’s ...
Original Post Date: Thursday, March 20, 2014 One of the complications in generating Bids and Proposals for Modules and Microcircuits is determining the “Should Cost” for better cost realism. Most of the electronic modules and their components in the proposals are not actually manufactured by the Proposer, but rather by a subcontractor, thus becoming a Purchased item. It is difficult to determine the cost of making the Module and to arrive at a fair cost. Costs for the modules include Assembly and Test costs together with the component costs. Components such as ASICs (Application Specific Integrated Circuits) have both the cost of developing the devices and ...
Original Post Date: Monday, December 30, 2013 Unless you live under a rock, you are aware of the healthcare.gov rollout disaster.  While similar IT failures are regularly in the news, the high profile of healthcare.gov has really mainstreamed awareness of the fragility of many IT projects.  Check out this article entitled ‘The Worst IT project disasters of 2013’.  It details IT project failures such as: IBM’s failure to deliver on a payroll system project that could potentially cost taxpayers up to US $1.1 billion; SAP’s failure to deliver satisfactorily on requirements for ...
Original Post Date: Tuesday, December 3, 2013 Forrester defines big data as “the techniques and technologies that make capturing value from data at extreme scales economical”.   Wikipedia defines it as “a collection of data sets so large and complex that it becomes difficult to process using on-hand database management tools or traditional data processing applications.  The challenges include capture, curation, storage, search, sharing, analysis and visualization”.  Many use the 3Vs to describe the characteristics of big data – Volume, Variety and Velocity.   Basically, Big Data refers to number crunching of epic proportions, accomplishing in minutes what may have ...
Original Post Date: Friday, October 4, 2013 In my "Work Breakdown Structures are Workable!" blog, we reviewed the subtleties of using the [object] tag to your advantage in creating different sorts and roll-up subtotals. As a follow-up, I’d like to drill down a bit on the initial step of using the “copy grid” exports. Each row number is unique, thus creating an identifying key for the VLOOKUP function in Excel. Since all object × activity instances are allocated 100% to one of the three phases (with very rare exception), these row keys allow you to sort and re-group ...
Original Post Date: Friday, October 4, 2013 TruePlanning results have many options, including viewing Costs by Activity. While simple, this view can be quite powerful, especially when exported for re-organization and manipulation. In a recent exercise, the WBS mapping of common objects, estimated in multiple separate scenarios, presented a non-trivial chore in Excel. “Transposition” features work fine for matrices, as do pivot tables. But how does one map object-by-activity grids into activity lists, similar to MIL-STD-881A, with singular “roll up” instances of all nonzero object costs? The secret is in how TruePlanning appends each activity output with the ...
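For illustration, the same grid-to-list reshaping can be sketched outside Excel as an un-pivot plus roll-up; the object and activity names below are invented and the code is not part of any PRICE product.

```python
# Illustrative only: un-pivot an object x activity grid into a flat activity
# list, keep the nonzero instances, and roll up object costs per activity.
import pandas as pd

grid = pd.DataFrame(
    {"Design": [400.0, 150.0, 0.0],
     "Code & Unit Test": [900.0, 0.0, 310.0],
     "Integration & Test": [250.0, 75.0, 120.0]},
    index=["SW CSCI A", "SW CSCI B", "HW Assembly"],
)

flat = (grid.rename_axis("cost_object").reset_index()
            .melt(id_vars="cost_object", var_name="activity", value_name="cost")
            .query("cost > 0"))                       # keep only nonzero instances

rollup = flat.groupby("activity", sort=False)["cost"].sum()
print(flat)
print(rollup)
```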
Original Post Date: Friday, October 4, 2013 The late Philip Crosby’s “Quality is Free” taught us that an investment in quality is more than offset by the prevention of defects, based upon an understanding of requirements. Only with the latter can lack of conformance (and its subsequent costs) be captured and hence quality quantified. So how then is Parametrics relevant? Parametric estimating is more than cost modeling. Our craft represents an initial consulting function into the accuracy and completeness of program planning concepts. Our customers trust us to know when to ask and when to supplement. Yes, we are mathematical and financial modelers ...
Original Post Date: Friday, October 4, 2013 In his August 2010 blog entry, Zach Jasnoff outlined typical client perspectives for the different types of analyses that TruePlanning can accommodate.  Working on a large project, we’ve experienced realistic situations where the initial intent... and model structuring… later have the boundaries of model appropriateness stretched.  An AoA, for example, is meant to measure deltas between a baseline and its alternatives.  If common costs “wash” then they can be excluded… which becomes an issue when the analysis is treated as a Rough Order of Magnitude for customer budgeting. Likewise, if a ROM or ICE of ...
Original Post Date: Friday, October 4, 2013 I’m not a golfer. But we’ve all heard one say “that’s why I play” after hitting a shot and feeling like it all came together. What “it” is, in terms of mechanics and timing, I’m not really sure. In our own world of parametrics, it’s the feeling of adding value in that golden moment of facilitating decisions and forward momentum. We wear many hats: estimating, consulting, systems engineering... even cost accounting. Building an AoA, ICE or ROM is where the rubber meets the road with regard to configurations and assumptions. Not too long ago I was in a discussion with ...
Original Post Date: Friday, October 4, 2013 In Parametrics is Free, I acknowledged receiving (too late) “you should’ve known to ask that” over the years. Quality control after the fact is fine; but it’s better and cheaper to take a systematic approach to quality assurance as part of your estimating process. The sheer volume of what we model can often keep us so close to the details that we are unable to step back and put our QA hat on for a sanity check. Enter Quality! On a very large project, our team has introduced a few regular cross-checks, notwithstanding typical ...
Original Post Date: Friday, October 4, 2013 Like titanium and other exotic metal materials, “composites” (by definition, combinations of materials) offer significant weight savings and reduced part counts, but at the price of high production cost. Sounds contrarian to our parametric cost estimating view? Not really. Complexity of manufacture is considerably higher. Likewise, process index and structural tooling values grow. Plus, design lead times drive developmental cycles. That said, understand that composites represent more than a material type. They can involve a highly labor-intensive approach to preparing, braiding/winding, molding, bonding and modular assemblage. Yes, some aspects of braiding and molding lend themselves ...
Original Post Date: Friday, October 4, 2013 Parametric modeling is excellent for all aspects of early-concept cost estimation, including go/no-go decisions downstream. So, in the spirit of bringing transparency to (ethical) financial engineering… why not apply our craft to pricing “real-options”? The latter are essentially strategic opportunities for engaging resources (cost/schedule) into projects, ventures, investments, or even abandonments. The opportunity choice has value itself! Unlike static project Net Present Value (often, but not exclusively, approximated with Discounted Cash Flow), which assumes pre-defined decisions, real-options reflect the merit of flexibility. If an R&D or proof-of-concept presents viability/marketability learning, the option has positive ...
Original Post Date: Friday, October 4, 2013 Ahhhh, the 80s… a challenging (but often confusing) time in an evolving computing world. Working in 1985 as a software estimator as well as an SQA engineer in a quality assurance department that “audited” real-time projects using new concepts like OOD & OOP… well, you get the picture. It was a great time to get immersed in great work. And the good news: that company’s process as well as its developers were bullish on a young estimation/quality type asking plenty of questions… as long as they were of the Yes-No variety. And ask ...
Original Post Date: Friday, October 4, 2013 ...wear the worst shoes. The cobbler was a master at his craft; he was just too tired to practice it when he got home from the shop. Sound familiar? A disciplined approach to understanding (functional) requirements as well as analogous projects (with actuals) is our not-so-secret sauce. Why run the risk of creeping back up our career learning curve? There’s already enough scope creep to keep us busy. Plus, for you management types charged with prospecting, a consistent approach towards estimation is a great way to connect with people who've felt the pain of ...
Original Post Date: Friday, October 4, 2013 My "Real Options Valuation" blog suggested the use of parametrics in real options valuation. I’d like to offer the generalized use of our type of modeling in valuing tangible assets. Typically, fundamental analysis evaluates the intrinsic value of securities. I won’t attempt to compete with Warren Buffett here. But it is certainly the case that the value of a company, or of a portfolio of securities reflecting many companies, is based in part on the market value of its product assets and their potential for future earnings, as well as on other objective and subjective considerations. In parametric estimation, ...
Original Post Date: Wednesday, September 25, 2013 The “System Folder” cost object found at the start of every TruePlanning project is often confused with the “Folder” icon, but the two should not be confused. The “Folder” icon does not have an input sheet at all. It is not a cost object and contains no cost estimating logic or relationships.  It is provided as a collection point so that cost objects can be grouped for clarity, for example to separate out phases of the acquisition lifecycle or to divide costs between subcontractors.  The “System Folder”, in contrast, contains all ...
Original Post Date: Wednesday, September 25, 2013 We may all agree that risk analysis is a necessary, vital part of any valid/defensible cost estimate.  We may not agree as much on the best approach to take to quantify risk in an estimate.  All estimates contain risk.  In the words of a wise cost estimator I know, “That’s why they’re called estimates, and not exactimates!”  We must quantify and manage levels of risk.  Why?  One vital part of a successful program is the ability to build a budget based on reliable cost projections.  Reliability increases when we can analyze inherent risk, ...
Original Post Date: Wednesday, September 25, 2013 A lot of clients have been expressing interest in modeling ASICs, FPGAs, and various other electronic modules inside TruePlanning® (TP). The TruePlanning® 2014 release will add the capability to model all of these products inside our framework. Not only will you be able to model these products, you will of course also be able to model the cost of integrating these electronic components with Hardware and Software components. In addition, you will be able to estimate the program management of your total project through our integrated framework. TruePlanning Microcircuits ...
Original Post Date: Wednesday, September 25, 2013 In Government contracting, all contracts are made up of a network of suppliers. The Prime contractor who won the overall bid usually has a supply chain of vendors from whom it receives products and services. In addition, it has Subcontractors who provide services under a contracted agreement of work. These vendors and subcontractors most likely have their own networks of suppliers, which allows for a cost-effective supply chain that extends across America and to other nations. Vendors sell identical or similar products to different customers as part of their regular operations. These ...
Original Post Date: Wednesday, September 25, 2013 “Integration Factors – What Makes TruePlanning™ Stand Out in the Crowd” In today’s world, system integration is becoming more and more important. The government has started asking for designs with additional capabilities that allow connectivity both with systems under construction and with systems already in use and deployed. Systems integration is important because it adds value to the system, enabling capabilities that are now possible because of new interactions between subsystems. In a recently posted article on “The True Costs of Integration”, the writer defined the costs of a typical integration ...
Original Post Date: Wednesday, September 25, 2013 Introduction/Problem Statement A current client has expressed interest in the default values on the simple vs. the detailed input sheet. More specifically, the question arose because this particular customer, as well as others, had a misconception about the simple vs. detailed input sheet default values. Most users did not realize that if they were only entering values on the simple input sheet, the detailed input sheet default values were still being used in the calculation of their cost estimate. So the question became: how much are each of these default value inputs ...
Original Post Date: Wednesday, September 25, 2013 These days bidding can be a game, and contractor leadership is constantly making decisions on whether to take on risk in order to stay competitive or to bid conservatively for the safety of not overrunning.  You may complete a cost model for a program, spend time analyzing the uncertainties behind each input, and in the end find that your estimate lands at the 30% confidence level.  After some strategic analysis, the bid leadership team decides it would like to bid at the 80% confidence level: “please present your estimate to support that total”.  ...
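A minimal sketch of the percentile arithmetic involved, assuming the program’s cost uncertainty has been summarized by a lognormal distribution; the median, spread, and dollar figures are invented for illustration.

```python
# Illustrative only: where a point estimate sits on an assumed cost
# distribution, and what the 80% confidence level cost would be.
from math import exp, log
from statistics import NormalDist

median, log_sigma = 100.0, 0.25          # $M; assumed lognormal parameters
point_estimate = 88.0                    # $M; where the model's estimate landed

std_normal = NormalDist()
cl_of_estimate = std_normal.cdf(log(point_estimate / median) / log_sigma)
cost_at_80 = median * exp(log_sigma * std_normal.inv_cdf(0.80))

print(f"Point estimate ${point_estimate:.0f}M sits near the {cl_of_estimate:.0%} confidence level")
print(f"Cost at the 80% confidence level: ${cost_at_80:.1f}M")
print(f"Uplift needed to bid at 80%: {cost_at_80 / point_estimate - 1:.1%}")
```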
Original Post Date: Wednesday, September 25, 2013 Background: FPGA design typically uses a library of Tiles (Sets of gates and transistors) and a CAD system to physically lay out the actual active devices on an empty portion of substrate. These devices are interconnected by physical (usually copper) traces to route the signals and perform the desired tasks. The design may be totally customized for a single set of functions and may not need any form of programming. Other designs may allow some parts of the device to be electronically re-programmed to allow the device to be calibrated or adjusted for ...
Original Post Date: Wednesday, September 25, 2013 A current consulting client has expressed interest in modeling scheduled overhauls, above and beyond scheduled maintenance. The latter is well-addressed by TruePlanning. Our challenge is to utilize the model and its calculated lifecycle metrics to estimate the former as well. We have recently developed an approach to address this specific need that I’d like to share with you here. Terminology is important to level-set.  Notwithstanding replacement procurement, restoring equipment readiness via operations & maintenance activities usually falls into one of three funded categories[1]:  Inspection/ Repair at the organizational and/or direct support ...
Original Post Date: Wednesday, September 25, 2013 As promised in my last blog (“System and Assembly Objects in the context of Hardware AoAs”), the integration of Software COTS is a subtly different challenge.  Customers are typically presented with two scenarios:  integration within a single system and integration within a system of systems (“SoS”).  Both cases are handled well by TruePlanning™ with specific parameter choices that control multiple activities relevant to software integration.  As I’ve come to appreciate PRICE’s competitive advantage with our Framework’s approach to the latter SoS case, I thought describing these two scenarios would help you to ...
Original Post Date: Wednesday, September 25, 2013 During a recent Analysis of Alternatives (“AoA”) consulting project, our customer asked that we provide more insight into TruePlanning’s System and Assembly objects, which in our AoA context we termed Systems Engineering/Program Management (SE/PM) and Alternative Integration/Test, respectively. The customer’s challenge was understanding our parametric model’s treatment of principally hardware-COTS objects, combined with other cost, purchased service and training objects. Our Chief Scientist, Arlene Minkiewicz, provided us with insights that I’d like to share with you, as well as my views on how we at PRICE Systems have consistently used these parent ...
Original Post Date: Wednesday, September 25, 2013 It is impossible to find a news website or magazine that is not full of articles on the effects of Sequestration.  As a cost estimator, I find the topic very interesting (and troublesome).  The immediate effects of Sequestration are widely discussed.  However, I do not see quite as much news coverage on the second and third order effects of this extremely complex policy decision. The Department of Defense (DoD) has a specific target that must be removed from the budget over the next 10 years.  Some analysts claim a doomsday scenario.  Others claim it ...
Original Post Date: Wednesday, September 25, 2013 In a recent National Public Radio (NPR) interview, Winslow Wheeler, Director of the Straus Military Reform Project of the Project on Government Oversight in Washington, D.C., spoke on the recent problems with the Joint Strike Fighter acquisition process.  “Wheeler explained that Lockheed Martin, the manufacturer of the jet, uses a pricing vocabulary that masks costs. ‘Flyaway costs, non-recurring and recurring costs, and lots of gobbledygook, and they’ll say that comes to a number like $60-$70 million dollars. And, it’s complete baloney,’ said Wheeler.” (pogo.org)    The F-35 has the distinction of being the most ...
Original Post Date: Wednesday, September 25, 2013 “On 29 July 2003, the Acting Under Secretary of Defense (Acquisition, Technology and Logistics) signed a policy memorandum entitled “Policy for Unique Identification (UID) of Tangible Items – New Equipment, Major Modifications, and Reprocurements of Equipment and Spares”. This Policy made UID a mandatory DoD requirement on all new equipment and materiel delivered pursuant to solicitations issued on or after January 1, 2004. USD(AT&L) issued verbal guidance that tangible assets manufactured by DoD’s organic depots were to be considered “new” items which fall under UID marking policy, beginning 1 January, 2005. An item is considered “significant”, and will be uniquely ...
Original Post Date: Wednesday, September 25, 2013 What follows is PRICE's interpretation of how to model FPGAs and ASICs inside our current version of TruePlanning® 2012 SR2 and older versions. An ASIC is an application-specific integrated circuit, customized for a particular use. FPGAs are field-programmable gate arrays, a modern-day technology that can be used in many different applications because of their programmable logic blocks. In industry, ASICs have grown from 5,000 gates to over 100 million. Designers of digital ASICs use the VHDL language to define the functionality.   Estimating Best Practice for TruePlanning: Model your electronics at the board level ...
Original Post Date: Wednesday, September 25, 2013 Last November I hosted a live webinar that discussed the use of the companion applications. This session helped to further explain how to use them in conjunction with TP, their history, why we created them, etc. During the presentation I showcased the successes I have encountered with using them, both in the recent AoA I described in part 2 and in this blog, part 3. You can find the recorded webinar on our site. In addition I described, as I am going to do here, the differences between the large project engine and the Excel ...
Original Post Date: Wednesday, September 25, 2013 Every day we use tools like TruePlanning to build up detailed parametric cost estimates.  We could spend weeks collecting data and design information, and weeks honing details on risks and uncertainties.  When we finally get to a reasonable point estimate, or even a distribution of probable estimates, there are always more questions.  Of course the range of queries depends on the purpose of the estimate and who your consumer is.  If you are preparing an estimate for a competitive proposal, a company executive may be your consumer.  They may want to know, “What is the ...
Original Post Date: Wednesday, September 25, 2013 After we spend time building up a point estimate, a cost estimator always has to do some additional work to break the estimate down into terms their estimate consumer will understand, or to perform different comparisons for various analyses.  Sometimes, you need to map the estimate into a Standard Work Breakdown Structure or Cost Element Structure.  Sometimes you want to compare to a bottom-up or grass-roots estimate.  Or, if you are planning budgets or manpower out into the future, you need details.  We have to speak a lot of languages in order to ...
Original Post Date: Wednesday, September 25, 2013 I’ve recently had a number of users ask, “How do I model life cycle costs for a missile that just sits on a shelf?”  I had never actually tried to model this, but of course I know it’s possible.  So I turned to some of my fellow PRICE experts, and found that of course this is not the first time anyone has ever tried to model this kind of thing… Many ordnance weapons such as mortar shells, torpedoes, bombs, missiles and various projectiles are stockpiled until they are actually needed. These weapons ...
Original Post Date: Wednesday, September 25, 2013 We have been blogging a lot lately about finding the right manufacturing complexities for your hardware projects using TrueFindings, and the many ways you can determine the most appropriate manufacturing complexities: taking averages from related projects, finding a correlation to another descriptive parameter (like horsepower for engines), or even looking into multivariate regression to determine the best value.  If you are a TruePlanning user, you know that Manufacturing Complexities (for structure or electronics) are the most common parameters to use in calibration, and that the manufacturing complexity drives both Production and development ...
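Two of the approaches mentioned above, the project average and the correlation to a descriptive parameter, can be sketched as follows; the horsepower and complexity values are invented and are not PRICE calibration data.

```python
# Illustrative only: average the calibrated complexities of related projects,
# then fit complexity against a descriptive parameter (horsepower here).
import numpy as np

horsepower = np.array([150.0, 300.0, 450.0, 600.0])
complexity = np.array([5.1,   5.6,   6.0,   6.3])    # calibrated manufacturing complexity values

avg_complexity = complexity.mean()

slope, intercept = np.polyfit(horsepower, complexity, 1)
r = np.corrcoef(horsepower, complexity)[0, 1]

new_hp = 520.0
predicted = slope * new_hp + intercept
print(f"Average of related projects: {avg_complexity:.2f}")
print(f"Fit vs. horsepower (r = {r:.2f}): predicted complexity = {predicted:.2f}")
```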
Original Post Date: Tuesday, September 24, 2013 Risk Analysis Methodology Overview – Method of Moments In this second of three articles on risk analysis, we will discuss the Method of Moments.  The Method of Moments (MoM) is an alternative to Monte Carlo simulation.  Along with the methodology, we will present some pros and cons of using MoM over Monte Carlo.  What is a moment? Before we discuss the methodology behind MoM, we first need to talk about moments.  Caution:  for all the master statisticians out there, this article is meant to boil down complex topics in an easy-to-understand manner.  There are obviously ...
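As a hedged illustration of the moment arithmetic (not necessarily the article’s exact procedure), here is a cost roll-up that sums element means and variances under an independence assumption and then approximates a percentile with a normal assumption; the triangular spreads are invented.

```python
# Illustrative only: method-of-moments style roll-up of WBS element costs.
# Assumes independent elements, so means add and variances add.
import math

# (low, most likely, high) triangular spreads per WBS element, $M (invented)
elements = [(8.0, 10.0, 14.0), (3.0, 4.0, 6.0), (18.0, 20.0, 27.0)]

total_mean = 0.0
total_var = 0.0
for low, mode, high in elements:
    mean = (low + mode + high) / 3.0                                          # triangular mean
    var = (low**2 + mode**2 + high**2 - low*mode - low*high - mode*high) / 18.0  # triangular variance
    total_mean += mean
    total_var += var

sigma = math.sqrt(total_var)
# Rough 80th percentile via a normal approximation of the total (z ~ 0.8416)
print(f"Total mean ${total_mean:.1f}M, sigma ${sigma:.1f}M")
print(f"Approx. 80% confidence cost: ${total_mean + 0.8416 * sigma:.1f}M")
```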