An old friend who teaches at a top-25 business school was expressing some dismay that some undergraduates do not grasp how profoundly the statistics presented in class will influence their lives. Those who fail to be mindful of the principles are at a serious disadvantage. Now, one could harp about slot machines or scratch-off lottery tickets, but since these writings are engineering-career focused, I’ll present several specific examples of how overlooking statistics can bite you in a work setting like an engineering consulting firm.

[Image: Tools for estimating geothermal project output (GeothermEx)]

Three common situations we’ll discuss are:

  1. Proposals and expected value
  2. Capacity factor
  3. Instrumentation uncertainty, design margins and analysis

Proposals and expected value

Ever hunted? Buck Fever is a common phenomenon in the North Woods. When you are on a hunt, and what you think is a target appears, some part of the lizard brain seems to go into reaction mode. The exhilaration about the opportunity can get so extreme that at least in Wisconsin more than a few cows, horses and people have been shot during deer season.

This spirit seems to permeate people engaged in business development as well, who may want to chase everything that appears with frenzied abandon. An array of opportunities may come across one’s desk in the course of a week, and one must prioritize. Perhaps you are faced with a set of potential projects you could propose on, such as:

  1. Huge and splashy: an open tender for the design of a new massive project in an international “hot spot” hits the streets. If your firm wins this it could really put your name on the map. You would be competing with bidders from around the world.
  2. Big and boring: an existing client needs a design for a retrofit of an old plant with new equipment. You feel it is likely that at best there may be 1-2 other competitors, but you should have an edge given your previous proven relationship.
  3. Startup tech: an entrepreneur with a radical new idea wants your help refining the design. They assure you it will be the next bombshell innovation in the field.

As a favorite podcast states, “you can have anything but not everything.” In truth, it seems few believe that. If your firm is not busy at all, perhaps you have the bandwidth and the need to chase everything. If time is at a premium, however, you need to allocate your resources more intelligently. One way to prioritize your targeting is to view it through the lens of expected value: the product of anticipated revenue and the probability of winning. One should consider the cost to prepare a proposal as well, possibly with sample values as shown in the table below.

Opportunity      Project Value   % to Win   Expected Value   Cost to Propose
Huge/Splashy     $4M             10%        $400,000         $20,000
Big/Boring       $1M             40%        $400,000         $6,000
Startup Tech     $10k            70%        $7,000           $1,500
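As a rough illustration, here is a minimal sketch of the arithmetic behind such a table, with the figures above hard-coded as the assumptions:

```python
# Expected-value screen for the three sample opportunities above.
# All figures are the illustrative numbers from the table, not real proposal data.
opportunities = [
    # (name, project value $, probability of winning, cost to propose $)
    ("Huge/Splashy", 4_000_000, 0.10, 20_000),
    ("Big/Boring",   1_000_000, 0.40,  6_000),
    ("Startup Tech",    10_000, 0.70,  1_500),
]

for name, value, p_win, cost in opportunities:
    expected_value = value * p_win              # anticipated revenue x probability of winning
    ev_per_proposal_dollar = expected_value / cost
    print(f"{name:<12}  EV = ${expected_value:>9,.0f}   EV per proposal dollar = {ev_per_proposal_dollar:5.1f}")
```

On these made-up numbers, the Big/Boring job returns more than three times the expected revenue per proposal dollar of the Huge/Splashy tender; that is exactly the kind of comparison worth having in hand before deciding where to spend your proposal hours.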

It is common to hold Go/No-Go meetings where these values, probabilities and priorities are assessed. Existing reliable clients deserve and receive top priority, and one sometimes has to point this out to new people brought in who want to chase bright shiny things like the Huge/Splashy projects, to the exclusion of others. For some reason new potential partners seem to have more gloss than familiar ones.

Distortions can be introduced into this process through the assignment of the “% to win.” Everyone thinks their child is above average. In similar fashion, overly optimistic assessments of your chances to win an opportunity distort the expected values and hence may misdirect your team’s efforts. If there are ten competitors for the Huge project, a 50% chance may be delusional, and your odds may well be closer to zero than to 10%.

Then take the small project. Engineers are gadgeteers by nature, so projects that promise some new cutting-edge tech development inherently draw them like moths to a flame. Sure, we all like advancing the industry. But we need to recognize that new startup firms often don’t really know what they want or need, and you, the experienced one, may spend a lot of (free) time up front educating them and developing your proposal, relative to the size of the opportunity. Spending a couple of days developing a proposal on a small job may wipe out any profit you might make even if you win, and your company likely ain’t the March of Dimes (2:10). And even if the tech is a great success and you collect your modest and reasonable fee, a consultant will not share in the massive downstream startup upside the way an owner will. So for the good of your firm, ration your efforts on science projects; steady, reliable work is also needed to keep the lights on.

In an environment where you need to prioritize your efforts chasing opportunities, consider expected values rationally. The math is simple but these principles are routinely disregarded, likely due to cases of Buck Fever.

Capacity Factor

The definition of capacity factor for a power plant is the annual amount of electricity generated (typically net MWh), divided by the annual output it would have delivered were it to operate at its design output for each hour of the year (8760 for non-leap years). A baseloaded geothermal or nuclear plant might have a capacity factor over 90%. Intermittent renewables such as PV solar or wind might deliver 20-40% depending on the site. Hydro plants might have a wide range, perhaps 20-90+% depending on seasonal water availability.
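In code the definition is a one-liner; the numbers below are illustrative assumptions rather than data from any particular plant:

```python
# Capacity factor = actual net generation / (rated output x hours in the year).
# All values here are illustrative assumptions.
rated_mw = 50.0                  # nameplate net output, MW
net_generation_mwh = 400_000.0   # metered net generation over the year, MWh
hours_per_year = 8760            # non-leap year

capacity_factor = net_generation_mwh / (rated_mw * hours_per_year)
print(f"Capacity factor: {capacity_factor:.1%}")  # ~91% here, baseload territory
```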

Your development of the plant design and its rated output under perfect conditions is only one part of the puzzle. It’s one thing to develop a process model on a clean sheet that shows output Y for inputs X1, X2 and so on. The achievable capacity factor depends on fuel availability, plant reliability and environmental conditions, among other factors, all of which will vary from year to year. The actual achievable capacity factor, and hence annual generation, is what will drive revenue and make or break the project’s eventual success.

All too often we see banker types develop financial models with simplistic and optimistic accounting for capacity factor. This is where you need to raise your grease-stained engineering paw and point out how potential variability in technical matters would influence the financial performance. Even a one-degree ambient temperature uncertainty (or heaven forbid, increase) over a period of a decade or two could have impacts in the millions of dollars of net present value…ignore it at your peril.
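To put rough numbers on that claim, here is a back-of-envelope sketch; the plant size, energy price, discount rate and capacity-factor shortfall are all assumptions chosen purely for illustration:

```python
# Rough NPV of a small, persistent capacity-factor shortfall.
# Every input is an assumption chosen to illustrate scale, not project data.
rated_mw = 50.0        # net rated output, MW
price_per_mwh = 70.0   # energy price, $/MWh
discount_rate = 0.08
years = 20
cf_shortfall = 0.01    # e.g. an ambient-temperature derate worth 1% of capacity factor

annual_lost_revenue = rated_mw * 8760 * cf_shortfall * price_per_mwh
npv_lost = sum(annual_lost_revenue / (1 + discount_rate) ** t for t in range(1, years + 1))
print(f"Annual revenue lost: ${annual_lost_revenue:,.0f}")   # ~$307k per year
print(f"NPV of loss over {years} years: ${npv_lost:,.0f}")   # ~$3M
```

On these assumed numbers, a single percentage point of capacity factor is roughly a three-million-dollar present-value problem, which is why it deserves a line in the financial model rather than a footnote.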

As another angle, consider that Engineers seem to be drawn naturally towards maximizing efficiency, in part because we have umpteen ways to calculate it. Achieving this often requires increased plant complexity. If the new innovative subsystem you add breaks and the plant has to be taken down for repairs, or makes the project more sensitive to fuel or environmental variations, both negatively impacting capacity factor, your innovation may be undesirable.

Project teams need to make frank quantitative assessments of the potential benefits and risks of features on output and capacity factor, as well as other aspects such as safety and capital/operating costs. Up front, one has to make reasonable estimates to support the financial model at the feasibility study stage. If the project goes ahead, teams generally review the features of power plants periodically (say at the 30% and 60% design progress stages) in a systematic fashion, using structured reviews like hazard and operability studies (HAZOPs) or constructability reviews. To support these efforts, we have to be able to soberly assess the probability of events and the uncertainties in their impacts.

Instrumentation uncertainty, design margins and analysis

Here is a simple example from a recent investigation. The design flowrate of a set of pumps at a power plant was 75,000 gpm (about 17,000 m³/h for you metric types). A periodic plant audit, using data from existing plant instrumentation and an indirect energy balance method to back-calculate the flow, came up with 69,000 gpm. A subsequent pump test using calibrated temporary test instruments calculated the pump flow as 73,000 gpm. The question is: are the pumps worn to the point where it would be worthwhile to replace them to restore flow, presumably increasing plant output and revenue?

Someone coming right out of university and faced first with the 69,000 gpm calculation might immediately conclude that there is potential to increase flow by 8.7% (75,000 vs. 69,000 gpm) by upgrading the pumps. Yet…not so fast, chief. It is going to disappoint several of you when you walk into existing power plants, but many installed instruments are nonfunctional or at minimum have not been calibrated since the last time you filed your tax returns. If you take several readings of dubious accuracy (say 1-5% each) and then process them through a set of formulas to back-calculate flow, you must propagate the impact of those independent variable uncertainties to get an overall uncertainty for the calculated value. It is possible your perceived difference/potential for improvement/excitement could be nothing more than a statistical nothing-to-see-here.
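Here is a sketch of that propagation using the standard root-sum-square approach; the simplified flow formula, the input values and the uncertainty figures are assumptions for illustration, not the actual plant calculation:

```python
import math

# Simplified indirect back-calculation: flow = heat duty / (rho * cp * deltaT).
# The formula, values and uncertainties below are illustrative assumptions.
heat_duty_btu_hr = 6.9e8   # inferred from other plant instruments, Btu/hr
rho_lb_per_gal = 8.33      # water density, lb/gal
cp_btu_lb_f = 1.0          # specific heat, Btu/(lb·°F)
delta_t_f = 20.0           # measured temperature rise, °F

flow_gpm = heat_duty_btu_hr / (rho_lb_per_gal * cp_btu_lb_f * delta_t_f * 60)  # ~69,000 gpm

# Relative uncertainties of the inputs (plant instruments of dubious pedigree)
u_heat_duty = 0.04   # 4%
u_delta_t = 0.03     # 3%

# For a product/quotient of independent variables, relative uncertainties
# combine in root-sum-square fashion.
u_flow = math.sqrt(u_heat_duty**2 + u_delta_t**2)
print(f"Back-calculated flow: {flow_gpm:,.0f} gpm ± {u_flow:.0%}")
print(f"i.e. roughly {flow_gpm*(1 - u_flow):,.0f} to {flow_gpm*(1 + u_flow):,.0f} gpm")
```

With an overall uncertainty of roughly 5% on these assumed inputs, the 69,000 gpm figure’s band overlaps the later 73,000 gpm test value (which carries its own ~3% band), so a good portion of the apparent shortfall could be measurement noise rather than pump wear.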

Uncertainty analysis is well explained in the excellent textbook Experimental Methods for Engineers by J.P. Holman. If you are not aware of the bounds of uncertainty on instruments and their impacts on calculated values, you may often find yourself chasing your tail after false conclusions.

Let’s say that after the 69,000 gpm calculation you get the test firm’s report and conclude that the 73,000 gpm measurement, with an attendant uncertainty of ~3%, shows there is really no perceptible opportunity for improvement of the pumps, considering the design flow is only 75,000 gpm: 3% of 73,000 gpm is roughly 2,200 gpm, so the measurement band spans about 70,800 to 75,200 gpm and comfortably includes the design value. The difference might be statistically insignificant. That might be a more realistic assessment, and you might put down that project and move on to something else.

Here’s a little sidebar with an alternative hypothesis, though: while we might design the plant to nominal stated values, most of us (if wise) build in some additional equipment margin, because if the plant doesn’t make performance there may be significant penalties. You don’t add 5-10% margin to your calculations in the educational system, because you must state a precise value to get a higher grade from the instructor. In practical design, however, if you design to the “gnat’s ass,” things will not work out perfectly and that gnat will turn around and bite you somewhere. You will have to get a feel for what sorts of design margins are appropriate in different settings to strike that fine balance between adding excessive cost and running the risk of shortfalls.

So, comparing those three values (75,000, 69,000 and 73,000 gpm), an interpretation might be that the plant was probably originally capable of something perhaps 5% above design (78,750 gpm). The 69,000 gpm calculation is not perfect and not as accurate as the 73,000 gpm test result, but it might provide a clue deserving more investigation. The test result is likely more accurate given the better instrumentation, and it is close to the design value, but there still may be some opportunity for improvement, given our knowledge of what sort of margins might originally have been present. You might want to spend more time looking at historical trends and more data before giving up on the project, unless you have more pressing tasks.

Instrument uncertainties and varying design margins mean that new engineers are faced with an uncomfortable pivot immediately after graduation. Your work of the past twenty years has been judged (harshly?) on your ability to calculate the precise answer. Step out of that into an operating facility, and you are faced with a slew of errant or misleading readings that you may have difficulty interpreting unless you are well versed in statistical perspectives.

Rather than Isaac Newton and his clean system of deterministic calculus, your approach might need to become more Hercule Poirot-ish, where you must collect data clues from a host of less reliable sources and run some statistical analysis to build a case. Plus, mustache.

Summary

These are just a few of the ways people routinely consider statistical principles (or fail to) in engineering settings. Do get skilled at calculating precise answers from formulae; that’s an appropriate objective of formal education. Then enhance those skills with a respect for statistical techniques that can turn a single precise number into an interpretable distribution of potential values for more realistic consideration. That respect for statistics is yet another differentiator that will benefit you in the long run.