Blog
Original Post Date: Monday, July 2, 2012  The COSMIC method for counting function points arose out of concerns that IFPUG (NESMA, FiSMA) function points are too focused on data-intensive business systems and consequently are inadequate for measuring the size of real-time systems.  The COSMIC function point counting method has been designed to be applicable both to business systems, such as banking and insurance, and to real-time software, such as telephone exchanges and embedded systems like those found in automobiles and aircraft.  The COSMIC method uses the Functional User Requirements as the basis for the ...
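Concretely, COSMIC sizes the Functional User Requirements by counting data movements (Entries, Exits, Reads and Writes), each worth one COSMIC Function Point (CFP). Here is a minimal sketch of that counting arithmetic; the functional processes and counts are invented for illustration only.

```python
# A minimal sketch of COSMIC sizing, assuming the standard rule that
# every identified data movement (Entry, Exit, Read, Write) counts as
# exactly 1 COSMIC Function Point (CFP). The processes are made up.

MOVEMENT_TYPES = {"Entry", "Exit", "Read", "Write"}

def cosmic_size(functional_processes):
    """Total CFP = number of data movements across all functional processes."""
    total = 0
    for movements in functional_processes.values():
        unknown = set(movements) - MOVEMENT_TYPES
        if unknown:
            raise ValueError(f"unknown movement types: {unknown}")
        total += sum(movements.values())
    return total

# Hypothetical functional processes for an embedded temperature monitor
processes = {
    "read temperature": {"Entry": 1, "Read": 1, "Exit": 1},
    "raise alarm":      {"Entry": 1, "Write": 1, "Exit": 2},
}
print(cosmic_size(processes))  # 7 CFP
```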
Original Post Date: Monday, June 18, 2012 This week I’m attending the Better Software Conference in Vegas.  I just attended a great keynote given by Patrick Copeland of Google.  The topic was innovation.  He talked about how innovators beat ideas, pretotypes beat productypes, and data beats opinions.  These sentiments are all part of the pretotyping manifesto. He started with the truth that most new products and services fail and proposed that while this is not unexpected, there is a good way to fail and a bad way to fail.  The good way to fail is to fail fast.  ...
Original Post Date: Tuesday, June 5, 2012 Ever wonder what programming languages are the most productive?  I recently did a little research into this topic using the International Software Benchmarking Standards Group (ISBSG) database. The database contains over 5000 data points with size and effort data for projects from a wide variety of industries, applications and countries.  Of course not all 5000 data points were suitable for my investigation.  The software size is measured using functional size metrics but the database accepts projects that use various counting methods.  I narrowed my search to projects that used the International Function Points ...
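For readers who want to reproduce this kind of analysis, a common way to express productivity is the Project Delivery Rate (PDR): effort hours per function point, where lower is more productive. Below is a sketch on invented records (not ISBSG data) showing the shape of the computation.

```python
# A sketch of the analysis described, on invented records: productivity
# as a Project Delivery Rate (PDR) in hours per function point,
# aggregated by primary programming language.
from collections import defaultdict

projects = [
    {"language": "Java",  "effort_hours": 4200, "size_fp": 400},
    {"language": "Java",  "effort_hours": 2600, "size_fp": 310},
    {"language": "COBOL", "effort_hours": 5100, "size_fp": 450},
    {"language": "SQL",   "effort_hours": 900,  "size_fp": 150},
]

totals = defaultdict(lambda: [0, 0])  # language -> [hours, function points]
for p in projects:
    totals[p["language"]][0] += p["effort_hours"]
    totals[p["language"]][1] += p["size_fp"]

# Lower PDR means higher productivity
for lang, (hours, fp) in sorted(totals.items()):
    print(f"{lang}: {hours / fp:.1f} hours per FP")
```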
Original Post Date: Friday, March 23, 2012 Recently I have been playing around with the International Software Benchmarking Standards Group (ISBSG) database for Development and Enhancement projects.  And in the interest of full disclosure I admit that I am more than a little excited to have close to 6000 data points at my fingertips.  I will further admit that there’s something quite daunting about having this much data: where to start, what should I be looking for, and how can I best use this data to offer useful guidance to inform software cost estimation?  For those of you not familiar ...
Original Post Date: Friday, February 24, 2012 When software developers first started writing programs for the Windows® Operating System it wasn’t pretty.  Everything had to be done from scratch – there was no easy access to tools, libraries and drivers to facilitate development.  A similar tale can be told by the earliest web site developers.  A web application framework is an SDK (Software Development Kit) for web developers.  It is intended to support the development of web services, web applications and dynamic websites.  The framework is intended to increase web development productivity by offering libraries of functionality common ...
Original Post Date: Thursday, February 9, 2012  Model Driven Engineering is a software development methodology focused on creating domain models that abstract the business knowledge and processes of an application domain.  Domain models allow the engineer to pursue a solution to a business problem without considering the eventual platform and implementation technology.  Model Driven Development is a paradigm within Model Driven Engineering that uses models as the primary artifacts of the development process, using automation to go from models to an actual implementation.  Model Driven Architecture is an approach for developing software within the Model Driven Development paradigm.  It was ...
Original Post Date: Thursday, December 15, 2011 This week CAST released their second annual CRASH (CAST Report on Application Software Health) Report.  The summary findings can be found here. You will also find a link to the Executive Summary.  The report highlights trends based on a static analysis of the code from 745 applications from 160 organizations.  The analysis is based on five structural quality characteristics: security, performance, robustness, transferability and changeability.  Some of the more interesting findings include: * COBOL applications have higher security scores than other languages studied (meaning they have better security)  I personally found this finding ...
Original Post Date: Monday, December 12, 2011 Check out this paper “The Economics of Community Open Source Software Projects: An Empirical Analysis of Maintenance Effort.”  In it the authors hypothesize that Open Source practices increase the quality of the software that gets produced and subsequently lead to code that is less costly to maintain.  Low quality code must be refactored more frequently than high quality code and there is substantial evidence that maintenance interventions tend to lead to even more degradation of the quality of the code.  So not only are low quality applications more expensive to maintain, the unit ...
Original Post Date: Monday, November 28, 2011  Check out this blog post  on project estimation.  The author discusses the practice of ‘padding’ effort estimates and how destructive this practice can be to healthy project management.  She suggests that project team members, rather than padding their individual efforts at a task level, should collaborate with project management in order to produce a good solid project plan with sufficient contingency reserves.  This allows for the project plan to reflect the most likely case but contains a safety net for those cases where the stuff that was unknown at the time of project ...
Original Post Date: Tuesday, November 8, 2011  Last week I attended the 26th annual COCOMO forum.  This meeting is an interesting combination of conference and working group and for me it’s a great place to take the pulse of the software and systems estimating community.  Lots of times you’ll go to a conference like this and feel as though the same old things are repeated year after year.  Not so with this conference – it is always a great mix of seasoned practitioners and graduate students with both groups providing forward looking information and inspiration on a variety of ...
Original Post Date: Thursday, October 20, 2011  Check out this article, “Why IT Projects May be Riskier than you Think”.  If you read through the comments you will see that this article truly resonates with many in the field.  In the article the authors discuss research covering 1,471 IT projects (large projects with an average cost of $167 million), comparing budgets and expected performance with actual costs and results.  Their results were surprising in that the average overrun was only 27%.  Turns out that the average isn’t what requires study but rather the outliers.  The study found that ...
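The statistical point, that a tame average can hide ruinous outliers, is easy to demonstrate. The sample below is invented purely to illustrate it:

```python
# Invented sample illustrating the article's point: the mean overrun
# can look tame while a fat tail of outliers carries the real risk.
overruns = [0.05, 0.10, 0.00, 0.15, 0.12, 2.00]  # fractional cost overruns

mean = sum(overruns) / len(overruns)
black_swans = [o for o in overruns if o >= 1.0]

print(f"average overrun: {mean:.0%}")  # ~40%, dominated by one project
print(f"projects overrunning by 100%+: {len(black_swans)} of {len(overruns)}")
```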
Original Post Date: Monday, October 3, 2011 Here’s an interesting article, “Technical Debt as Metaphor for Future Cost”.  In it the author discusses the acceptability of using the metaphor of technical debt to facilitate communications between business leaders and the software team when negotiating around the triangle (time, money, scope).  And while the author accepts the use of this metaphor as a good “short-hand” for communicating the fact that avoiding the work now is not sparing the cost but just rearranging the way the costs are incurred – and often increasing the overall costs that need to be spent.  The ...
Original Post Date: Friday, September 9, 2011 I have recently been following an animated thread on LinkedIn, “Death of a Metaphor – Technical Debt.” It’s been live for 2 months with over 200 contributions from dozens of different people.  The discussion was launched by questioning whether continued use of this metaphor makes sense.  The discussion thread weaves and bobs around actually answering this question, but it’s amazing how passionate the world is on this topic.  My personal opinion is that it’s a perfectly adequate metaphor because it helps create a discussion between IT and the business leaders in terms ...
Original Post Date: Friday, September 2, 2011  The IEEE published “Top 11 Technologies of the Decade” in the January 2011 edition of the IEEE Spectrum magazine.  It should come as a surprise to no one that the Smartphone was number 1 on this list.  The answer to the author’s question “Is your phone smarter than a fifth grader” was a resounding YES![1]   In 1983 Motorola introduced the first hand-held cellular phone.  It weighed in at two and a half pounds, had memory capacity for 30 phone numbers, took 10 hours to recharge and had a selling price of $4000 ...
Original Post Date: Thursday, August 25, 2011  Check out this Report on Big Data from McKinsey & Company published in June 2011.  Back in the day, when personal computers were becoming widely accepted and Microsoft Windows® was the new cool thing, SneakerNet was a common means of sharing data.  Certainly the introduction and subsequent improvements of networking technology and the Internet have made data sharing a whole lot easier and quicker.  But the concept of Big Data creates a whole new level of opportunity and potential for collecting and using data in ways heretofore unthinkable. So what is Big Data?  According ...
Original Post Date: Wednesday, July 27, 2011 Open Source software is software that is distributed publicly with all of its source code.  Users of open source software are encouraged to review the source code, make changes to it and share those changes with the rest of the user community.  The value in open source is that providing the source code to the user community allows those in the community who are willing and able to make improvements, add features, and fix bugs to do so. Open source takes the notion of peer review to the next level.  It means that instead of ...
Original Post Date: Tuesday, July 12, 2011 Check out this article from CIO magazine about managing your project budget.   The author, Jason Westland, suggests four things necessary to maintain control of your project budget.  While these are not earth shattering suggestions, sometimes project managers in the throes of a project can lose sight of their importance.  The strategies are: * Continually forecast the budget * Regularly forecast resource usage * Keep the team informed * Manage scope meticulously Or to put it another way – respect and revisit the Project Management Triangle.  (To learn more about the Project Management Triangle go to this ...
Original Post Date: Monday, July 11, 2011 There is a ton of code out there and we’re constantly adding more.  Gartner reported in 2009 that there were more than 310 billion lines of code in use worldwide. MS Word has grown from 27,000 lines of code in the first version to about 2 million in 2010.  The Windows Operating System grew from 3 million lines of code in 1992 to 20 Million in 1998.  You get the point – there’s lots of code out there doing lots of the same things that we may want our software to do. One ...
Original Post Date: Friday, June 10, 2011 I’m on my way home from the ISPA/SCEA (International Society of Parametric Analysts / Society of Cost Estimating and Analysis) Conference held in Albuquerque this week.  Attendance was very good (2nd best in the conference’s history) and the content seemed especially good this week.  I attended lots of good talks on topics ranging from SEPM (System Engineering, Project Management) cost estimating, Joint Confidence Levels, Software Estimating, Affordability, Agile software development and estimating for Enterprise Resource Planning Systems.  Of course, just because the topics are good and well presented doesn’t mean I have ...
Original Post Date: Friday, June 3, 2011  If I Google the phrase “cloud computing” I get about 49,900,000 hits.  That’s a lot of hits – more than 10 times the hits I get if I Google “service oriented architecture.”  This made me think that cloud computing is an area I needed to learn more about. So what are we really talking about when we talk about cloud computing?  “The cloud” is a generally accepted metaphor for the Internet.  End users access computing assets from the cloud using a model similar to one that homes and offices use to get electricity ...
Original Post Date: Wednesday, May 18, 2011 This week I am attending the Systems and Software Technology Conference 2011 in Salt Lake City.  I've been a regular at this conference for the last 20 years.  While attendance has declined, the conference continues to deliver quality content for developers and acquirers of software and software intensive systems.  The keynote was delivered by this year’s recipients of the prestigious Wayne Stevens Award.  Barry Boehm, one of the recipients, was well known to everyone in the room and the software community.  He gave a great presentation reviewing his technology predictions from a paper presented in 2006 and offered predictions for 2011 ...
Original Post Date: Tuesday, April 5, 2011  In 1961 at the MIT Centennial, John McCarthy opined “if computers of the kind I have advocated become the computers of the future, then computing may someday be organized as a public utility just as the telephone system is a public utility…. the computer utility could become the basis of a new and important industry”  [1].  In 2006, Amazon Web Services was launched providing computing on a utility basis.  Since that time the notion of cloud computing has been emerging and evolving. Cloud computing is a paradigm that makes the notion of utility ...
Original Post Date: Monday, March 28, 2011  So how did your basketball picks go this season?  My bracket is officially closed since absolutely no one picked any of the final four teams!   I am happy to report that I came in second with a whopping 36 correct picks - picks that most would judge to be pretty bad.  So where did we go wrong? Since I don’t really follow college basketball closely during the year I make my picks somewhat randomly – loosely based on the teams' standing but occasionally predicting an upset.  Naturally, the upsets I predicted were not ...
Original Post Date: Tuesday, March 1, 2011 The concept of the fuel cell was first published in 1838 by Christian Friedrich Schönbein.  Based on this publication Sir William Grove invented the precursor of the fuel cell in 1839. The Grove Cell created current by applying two acids to zinc and platinum electrodes separated by a porous ceramic pot.  In 1842 Grove developed the first actual fuel cell, which produced electricity with hydrogen and oxygen, much like many fuel cells in use today. Fuel cells remained an intellectual curiosity until the 1960s when the US space program identified a requirement for ...
Original Post Date: Friday, February 11, 2011 The DoD Cost Analysis Symposium (DODCAS 2011) is next week, Feb 15-18.  I’ll be there along with several of my colleagues at PRICE Systems.  This conference consistently provides an excellent source of information and shared experiences for the acquisition community and I am anxious to attend again this year.  Last year the conference occurred shortly after Congress passed the Weapon Systems Acquisition Reform Act of 2009 (WSARA) - and the majority of the sessions were focused on discussions about how the services, contractors and the government leadership planned on dealing with this new law.  From ...
Original Post Date: Monday, January 17, 2011 While I don’t like to admit to visiting a website entitled geekArticles.com, I did stumble across a reprint of an essay by Grant Rule “Bees and the Art of Estimating”  that some of you may find interesting and instructive.  The author participates in his own form of “Estimation Trivia” by posing the following challenge “Take paper and pencil and write your estimate for the number of insects in the average hive of English honeybees.”  Of the approximately 1100 software measurement and process improvement professionals he has challenged thusly,  only about 10 have ...
Original Post Date: Thursday, December 23, 2010 A current research interest of mine is fuel cells – where they are being used and what it costs to manufacture fuel cell systems.  I thought I would share some of what I’ve learned to date. A fuel cell is an electrochemical cell which converts some fuel, usually hydrogen, into electric current.  It does this through a reaction between the fuel and an oxidant in the presence of an electrolyte.  The waste products of this chemical process are water and heat.  Fuel cells, unlike conventional batteries, consume reactant from an external source ...
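For the usual hydrogen case, the underlying chemistry is standard and worth writing out; in a cell with an acidic (proton-exchange) electrolyte the fuel is split at the anode and recombined at the cathode:

```latex
% Hydrogen fuel cell with an acidic (proton-exchange) electrolyte
\text{Anode:}   \quad \mathrm{H_2 \rightarrow 2H^+ + 2e^-} \\
\text{Cathode:} \quad \mathrm{\tfrac{1}{2}O_2 + 2H^+ + 2e^- \rightarrow H_2O} \\
\text{Overall:} \quad \mathrm{2H_2 + O_2 \rightarrow 2H_2O} + \text{electrical energy} + \text{heat}
```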
Original Post Date: Monday, November 15, 2010 Last week I attended the 25th International Forum on COCOMO and Systems/Software Cost Modeling.  I attended for several reasons.  First of all, I was invited to participate on a panel whose topic was “25 years of Software Estimation: Lessons Learned, Challenges and Opportunities”.  Secondly, I have attended in the past and while it’s generally a small group, as such conferences go, I always come away impressed by the fact that so many smart people end up in one room and this year was no different.   But I digress; I really wanted to share ...
Original Post Date: Wednesday, October 27, 2010 Here’s a cool project.  The Bloodhound Project  is focused on building a land vehicle capable of breaking the 1000mph speed barrier.  The mission of the project is twofold.  The first is to “overcome the impossible using science, technology, engineering and mathematics”.  But the second is more interesting – this project is intended as motivation for the upcoming generation to embrace technology related fields.  Britain doesn’t have enough students interested in such fields and they are worried about their ability to compete in technological forays going forward. But how much should something like this ...
Original Post Date: Tuesday, October 12, 2010 National Boss Day is quickly approaching! While October 16th is the actual day, this year it will be observed on Oct 15th since the 16th falls on a Saturday. And what boss wants to hear from his or her employees on a day off, even to be showered with cards, flowers and accolades?  According to Barry Wood, Boss Day was started in 1958 when Patricia Bays Haroski of Deerfield, Ill. registered it as a special date with the US Chamber of Commerce to honor her boss (who was also her father).  October ...
Original Post Date: Monday, September 13, 2010  It's been brought to my attention that a post on hint fiction and hint project management, without a real example, is incomplete and unsatisfying.  To address this I have tied hint fiction to hint project management with the following story entitled "Another Day at the Office". Project problems abound; delays, turnover, scope creep.  Management concerns are palpable. Estimation exercise supports successful scope, schedule, cost negotiation.  Another rabbit out of the hat.
Original Post Date: Wednesday, September 1, 2010 Because I have enrolled in several on-line fiction writing workshops, I regularly receive newsletters about upcoming events in the world of fiction writing.  Several weeks ago I was quite intrigued when I received an invitation to enter a ‘Hint Fiction’ writing contest.  Here I don’t even know what hint fiction is and someone thinks I might be good enough at it to enter a contest – who knew?    Naturally, I Googled hint fiction (how did we get by without Google?) and found out that it is  “a story of 25 words or ...
Original Post Date: Friday, August 13, 2010 If you want to read an interesting article on EVM – check out ‘The Three Deadly Sins of EVM’  by Mike Mullaly.  In it he reflects some of my personal feelings about EVM but he does this much more eloquently than ‘it’s a crock’.  OK – while I have actually said that out loud – it’s probably a little too strong.  I do think that EVM may be a good tool to have in the toolbox – it’s just not the project panacea that so many make it out to be.  And it ...
Original Post Date: Friday, July 30, 2010 Earlier this week I presented a webinar on the topic of SOA governance – specifically focused on making sure that organizations include SOA governance as they plan to deploy SOA capabilities.  As sometimes happens when I am giving a presentation (especially one I have given before), I was struck with somewhat of an epiphany as I was relaying the material on my slides.  In this case it was not really a new idea about the material, but more a deeper understanding of why this topic really is important. To be honest, when I first ...
Original Post Date: Thursday, July 8, 2010 Which came first, the chicken or the egg?  We can look to Darwin for one theory, the Bible for another, but at the end of the day – nobody really knows.  There can be no chicken without an egg, nor can there be an egg without a chicken.  Thus we are left with a bit of a circuitous conundrum. Joint Confidence Level (JCL), NASA’s current best practice for program planning and management, also presents a circuitous conundrum.  When a program has a JCL of 70% this implies that there is a 70% confidence that ...
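One way to make the JCL idea concrete is a Monte Carlo sketch: the JCL is the probability that cost and schedule both land at or under their targets, jointly, not each on its own. The distributions, correlation and targets below are invented for illustration:

```python
# A minimal Monte Carlo sketch of a Joint Confidence Level: the
# probability that cost AND schedule both come in at or under their
# targets. Distributions, correlation and targets are invented.
import random

random.seed(1)
N = 100_000
hits = 0
for _ in range(N):
    cost = random.lognormvariate(0.0, 0.3)  # cost as a multiple of baseline
    # schedule growth loosely coupled to cost growth
    schedule = 0.5 * cost + 0.5 * random.lognormvariate(0.0, 0.3)
    if cost <= 1.25 and schedule <= 1.25:   # 25% reserve on both targets
        hits += 1

print(f"JCL ≈ {hits / N:.0%}")  # a joint probability, lower than either marginal
```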
Original Post Date: Tuesday, June 15, 2010  I recently read a great paper by Glenn Butts and Kent Linton, NASA’s Joint Confidence Level Paradox – A History of Denial.  In it, the authors present a very detailed analysis of many failed NASA projects along with some compelling theories on why so many projects fail and what can be done going forward.  While I’m not here to summarize their findings (interested parties can hit the link above and learn for themselves), there was one extremely interesting jewel in this paper that I felt the need to share. The reason I ...
Original Post Date: Thursday, May 13, 2010  Earlier this week I conducted a webinar intended to make PRICE users aware of the Cost Research Services available to them as part of the license fee they pay to use PRICE products. I thought I would recap the highlights of this webinar for those of you who might have missed it. At PRICE we understand that cost estimating tools, while useful and valuable, do not always present the complete solution. Every single cost estimation project presents new and unique challenges.  We think it's important that in addition to solid, time-trusted cost estimating models, ...
Original Post Date: Friday, April 23, 2010 Software project failures coupled with rapidly changing business needs are forcing organizations to revisit the way they go about building software.  Agile development has emerged as one possible solution to the woes of the software industry.  Agile enthusiasts claim significant increases in productivity and quality, while detractors cite instances where the reverse is true.  It seems to me that probably both are right – some of the time anyway.  Agile means many different things to different organizations.  There is a long list of agile tenets but not every method of agile applies all ...
Original Post Date: Thursday, April 8, 2010 Despite the plethora of literature on Technology Readiness Levels (TRLs) it remains a difficult concept.  I thought I would share my interpretation. For most of us the concept of technology readiness is hard to grasp. This is because, in general, our experiences with technology are with fully matured technology. In 1961, President Kennedy challenged US scientists, mathematicians and engineers when he announced that within the decade of the 1960s the US would ‘land a man on the Moon, and return him safely to Earth’. At the time, there were no solutions to solve ...
Original Post Date: Friday, January 15, 2010 Failed software projects are always bad but there are additional complications when there is a contract in place to deliver the software.  Disputes over failed software can result in costly litigation that generally damages both the vendor and the buyer. According to observations of Capers Jones in "Conflict and Litigation Between Software Clients and Developers" (2007) , 5% of the projects his clients were involved in either had litigation pending or were currently involved in litigation over project failures.  His findings indicate that it is very large projects, over 10,000 Function Points that ...
Original Post Date: Wednesday, September 30, 2009 Recently I was interviewed by Doug Beizer of Federal Computer Week for an article about the shift of government agencies away from custom software development and towards the use of cloud computing.  The interest in this topic seemed to stem from the introduction of the Apps.gov online store earlier this month.  Having been in the software cost estimation community for more than 25 years, I have experienced this transition first hand but never really stopped to think about the whys and wherefores until questioned by Doug.  It was an interesting stroll down memory lane.  As an example, ...
Original Post Date: Monday, July 20, 2009  Did you know that, according to kgb, a single Google search produces 0.2g of carbon dioxide? Asking Google 2 questions is equivalent to boiling a tea kettle full of water.  If there were 2 billion Google searches a day in 2008, today we're looking at more than 400 million grams of carbon dioxide a day just for Google searches.  A part of my job at PRICE is to look into emerging trends and technologies to determine if and how changes in the world impact the costs of hardware, software and information technology projects.  ...
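The arithmetic behind that daily figure, using the post's own numbers:

```latex
% The arithmetic behind "more than 400 million grams a day"
2\times10^{9}\ \tfrac{\text{searches}}{\text{day}}
  \times 0.2\ \tfrac{\text{g CO}_2}{\text{search}}
  = 4\times10^{8}\ \tfrac{\text{g CO}_2}{\text{day}}
  = 400\ \text{metric tons per day}
```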
Original Post Date: Thursday, July 16, 2009 In an article in last week's Harvard Business, “IT Costs: Do You Speak Their Language?”, John Sviokla discusses the fact that as the information business continues to grow it is increasingly important for organizations to understand the impact of IT as it relates to their operating costs. This certainly rings true to us here at PRICE Systems, who have recognized this reality. TruePlanning 2009 has been developed by PRICE specifically to help organizations get their heads around the true costs of Information Technology. Application development projects can represent significant expense to an ...
Original Post Date: Tuesday, June 16, 2009  Bad project estimates lower profitability.  Despite this fact many business leaders don’t invest in improving their estimating capability, buying into the fatalistic myth that this is as good as it gets.  This is patently wrong.  Project portfolios are prioritized based on the total expected Return on Investment (ROI) of projects.  Investments in the wrong project based on bad estimates could lead to lost revenue or delay of net benefit. All around us we see reports of software projects which are over budget, delivered late or cancelled because they are taking too much time ...
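A hypothetical worked example (all numbers invented) shows how a bad estimate distorts the ROI on which a portfolio ranking rests:

```latex
% ROI as used for portfolio prioritization, with invented numbers:
% an expected \$1.8M benefit, a \$1.0M estimate, and a \$1.5M true cost
\mathrm{ROI} = \frac{\text{benefit} - \text{cost}}{\text{cost}}, \qquad
\mathrm{ROI}_{\text{estimated}} = \frac{1.8 - 1.0}{1.0} = 80\%, \qquad
\mathrm{ROI}_{\text{realized}} = \frac{1.8 - 1.5}{1.5} = 20\%
```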
Original Post Date: Thursday, April 30, 2009 It's finally Spring!  And along with the leaves on the trees, the beautiful flowers and the happy chirping birds.... it is once again Baseball Season.  Baseball season is a beautiful thing - and not just because, as a resident of South Jersey, my team is the 2008 World Champion Phillies.  I just love the game and everything about it.  I believe this is because with baseball the impossible becomes possible because anything can (and will) happen and with a good plan in place you can still be successful. I didn't always love baseball.  ...
Original Post Date: Thursday, April 2, 2009 I have to say that my foray into blogging has been an interesting one.  By definition, the Chief Scientist should be a nerdy sort of geek too high brow to pontificate on topics in such a pedestrian format.  Actually I kind of like it.  In part because I enjoy writing and I'm not picky about what I write - technical documents are OK but pontification works as well.  And in part because I know that in order to be a good writer in a particular genre one must read extensively from that ...
Original Post Date: Monday, March 23, 2009 Here’s a great article I happened upon while doing research for a paper I’m writing, “Lessons Learned: IT’s Biggest Project Failures.”  In this article we are treated to stories of IT projects that “first make people laugh and then” (hopefully) “make them think.”  As a long time student of the failed software project, I was neither surprised nor disappointed with the projects relayed.  The projects noted failed for reasons such as: * Failure to perform a should-cost analysis before selecting a supplier * Failure to recognize an unhealthy project before it ...
Original Post Date: Monday, March 9, 2009 The US Department of Defense (DOD) continues to be plagued with cost overruns on major weapons systems.  Last month Senators Carl Levin (D-Mich) and John McCain (R-Ariz) introduced the 2009 Weapon Systems Acquisition Reform Act intended to put measures in place to force the DOD to address the issues that cause overruns and schedule slippage.  Among other things, this legislation would create the position of Director of Independent Cost Assessment for Major Defense Acquisition Programs (MDAPs) and require the DOD to perform trade-offs between cost, schedule and performance early in the program lifecycle. ...
Original Post Date: Friday, February 6, 2009 Last week I was asked to participate in Career Day at my son’s elementary school.  I was both honored and humbled.  Honored because the school felt that my career was something the children would be interested in and humbled because I was forced to concoct a story that would make cost estimating and analysis both understandable and interesting to children from kindergarten through grade eight.  Fortunately the format was such that I presented to each grade individually so at least I did not have to come up with one story to address ...
Original Post Date: Thursday, January 22, 2009  Like many others, I was astonished last Thursday by the images on my browser of those 155 extremely lucky people standing in the Hudson River.  And they certainly were very lucky last Thursday.  If you’re destined to fly on a flight bound for collision with birds, you want it to be piloted by a hero like Captain Sullenberger.  The incident made me think about what a hero is and how we all have the opportunities to be heroic in our chosen professions. According to Wikipedia, a hero refers to a character that, in ...