Here is a taste of the NASA Symposium presentation in August 2018…
Wikipedia offers a nice overview of the history of software engineering, describing the stops and starts that have led us to where we are today. In the beginning, the complexity of the applications being developed was almost overshadowed by the logistics of the actual implementation process. People didn't have personal computers, feedback was not instantaneous, and it was impossible to make predictions about when a project would be complete. As technology and tools improved, software was used to solve increasingly complex problems, so the solution itself replaced the logistics as the bottleneck, and it was still almost impossible to make predictions about when a project would be complete.

The 'software crisis' describes a twenty-year period from the mid 60's to the mid 80's during which the industry was beginning to understand issues around productivity, quality and good architecture. This 'crisis' spurred a plethora of proposed 'silver bullet' solutions such as structured programming, Computer Aided Software Engineering (CASE) tools, object-oriented design and programming, standards, defined processes, formal methods, the Capability Maturity Model (CMM) and many others. All of these tools, techniques and methodologies have improved the overall productivity and quality of software.

Then along came the Internet, significantly increasing the complexity of things that could be accomplished with software. In light of emerging technologies and growing complexity, the formal processes, methods and standards were viewed by many as an impediment to progress rather than a support for projects' productivity and quality goals. Many smart software development professionals began to trend toward lightweight methodologies, or what we currently refer to as agile development.
In February 2001, a group of these smart software development professionals sat down to talk about these lightweight methodologies. The result of those discussions was the Agile Manifesto. This manifesto introduced the following tenets for software development:
We are uncovering better ways of developing software by doing it and helping others to do it. Through this work we have come to value:

Individuals and interactions over processes and tools
Working software over comprehensive documentation
Customer collaboration over contract negotiation
Responding to change over following a plan

That is, while there is value in the items on the right, we value the items on the left more.
Agile development processes rely on experienced, highly skilled people communicating with clients and each other to deliver software that provides the clients with the most value for their money. This requires both developers and consumers of software to accept the reality that things will change over the course of the project and that the software that is eventually delivered may not be the same as what was envisioned when the project began.
At any given time, the agile software development team is only working on the feature ranked most valuable by the customer. Estimation is performed by the team and is focused only on the feature that is currently on deck. At the end of an iteration, the customer may review the implementation to date and may reprioritize the remaining features based on the current state of the software or changes in their business or expectations. Estimating beyond the current iteration doesn't really make agile sense.
Unfortunately, the fact that estimation doesn't make sense for the agile team does not mean that it doesn't make sense for the folks who ask for estimates at various milestones within a program. A long legacy of very hardware-centric projects within NASA compounds this expectation of estimates based on the program requirements. And it is not an unreasonable request to want this forest-above-the-trees view of a program: NASA needs an idea of when their software will be delivered, and they need to be able to optimize resource allocations across many projects. This, of course, raises the question of how to perform an estimate for software when you really don't know exactly what software you're going to build.
Parametric techniques are particularly suited for this task, especially for an organization that has some historical data from previous agile projects. Parametric estimating models provide a repeatable framework through which an organization can study their past performance on similar projects and use what they learn to perform an estimate on the project at hand. Analysis of performance on past projects can inform decisions about productivity and other project drivers. This information can be used to drive the parametric algorithms to develop estimates for cost, effort and schedule.
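To make the parametric idea concrete, here is a minimal sketch of how historical data can calibrate an estimating model. The power-law form (effort = a × size^b, in the spirit of COCOMO-style models) and the historical project data are assumptions for illustration only, not any specific NASA model or dataset:

```python
import math

# Hypothetical historical data: (size in KLOC, actual effort in person-months)
history = [(10, 24), (25, 70), (50, 155), (100, 340)]

# Calibrate effort = a * size^b by least squares in log space,
# since log(effort) = log(a) + b * log(size) is linear.
xs = [math.log(size) for size, _ in history]
ys = [math.log(effort) for _, effort in history]
n = len(history)
mean_x = sum(xs) / n
mean_y = sum(ys) / n
b = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
     / sum((x - mean_x) ** 2 for x in xs))
a = math.exp(mean_y - b * mean_x)

def estimate_effort(size_kloc):
    """Parametric effort estimate (person-months) for a project of the given size."""
    return a * size_kloc ** b

print(f"calibrated model: effort = {a:.2f} * size^{b:.2f}")
print(f"estimate for a 75 KLOC project: {estimate_effort(75):.0f} person-months")
```

The calibration step is where the organization's past performance enters the model: the constants a and b capture observed productivity, and real parametric tools layer additional drivers (team experience, platform, reuse) on top of the same basic framework.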