The sixth element

What’s new?

The magazine Additive Manufacturing reported on 14 August 2020 on the use of graphene in 3D printing, also called additive manufacturing. Additive manufacturing uses filaments of plastic, sometimes embedded with other materials, often fibers, to increase the strength or improve other properties of the printed object. In this case, the ability of graphene to carry heat means that the resulting object cools more uniformly, reducing the tendency of layers to separate after printing.

On 24 August, the company Paragraf reported in the online journal EE Times that it “has developed an innovative method of producing graphene at scale.”

What does it mean?

Graphene is an arrangement of carbon atoms in connected six-sided shapes, as shown in the picture above. The arrangement is only one carbon atom thick, making graphene a two-dimensional sheet of material. In 2004, two researchers at the University of Manchester used sticky tape to peel off layers of carbon from graphite to create graphene. They shared the Nobel Prize in Physics in 2010 for their research on the material.

Graphene has useful properties. For example, it is very conductive; that is, electrons flow easily through graphene. This property is important in Hall effect sensors, which use the interaction of magnetism and electricity to detect position, for example, to detect the level of a floating magnet in a gas tank and thus the amount of gas in a car. Graphene is also extremely strong for its weight; it is transparent and flexible; and, as mentioned above, it conducts heat well.

Research on new materials, often composites of different elements, holds great promise. Graphene, carbon nanotubes, and buckminsterfullerenes (also called fullerenes or buckyballs, but not these dangerous small round magnets) are fascinating to me because they are just carbon, which is also the building block of all life on planet Earth (and, memorably, the signature of infestation in the 1979 movie Star Trek: The Motion Picture). Do an internet search on “my favorite element is carbon” and you will be rewarded with fascinating information about the element with atomic number 6. Not to mention carbon’s use in that old, old communication device, the pencil.

I should have known better, but I thought we would be farther along by now in practical applications of graphene. Graphene was hailed as a miraculous substance with great promise, with the 2010 Nobel Prize citation predicting that “a vast variety of practical applications now appear possible including the creation of new materials and the manufacture of innovative electronics.” A 2012 article in New Scientist was more cautiously optimistic: “Hundreds of applications have been suggested to take advantage of graphene’s remarkable properties. Some are more realistic than others, and difficulties remain to be overcome with all – but from computer chips to touchscreens, there are some promising ideas in the pipeline.”

Progress has been slow. However, even recognizing that graphene had been noticed before 2004 (the term graphene dates from 1961), the time scale of discovery, research, and application is typically very long. My favorite example is the fax, which was invented in 1843.

The Paragraf article that I cited at the start of this post discussed one ever-present barrier, difficulties in manufacturing: “These challenges mean there has been a lack of contamination-free, transfer-free, large-area graphene available in the market, and adoption in mass-market electronics remains slow. New solutions are clearly required if graphene is to make its mark in the electronics sector.” You cannot study a substance and, even more, you cannot apply it widely until the substance can be manufactured at a reasonable cost with high quality.

What does it mean for you?

Progress in technology is often slower than we want but also sometimes slower than we perceive it to be – many an “overnight sensation” in any field is actually the result of years of hard work. Graphene is fitting into that usual story very well. Progress is happening, with Patentscope showing 64,126 worldwide patents containing the word graphene at the time I am writing this sentence. But the University of Manchester, the birthplace of graphene, even says: “There are many thousands of patents relating to graphene. Many of these are unlikely to become reality.”

One key is improvements in manufacturing; another is that graphene may be useful when added to other substances (as in the additive manufacturing article cited at the start of this post) or when traces of other substances are added to it. On 29 January 2020 New Scientist reported that the addition of guano, yes, guano, improves some properties of graphene. The New Scientist delights in the title of the paper it cites as the source of this information: “Will any crap we put into graphene increase its electrocatalytic effect?” Thus it seems that, again, alloys or composites of different materials are in our future. A great deal of our technological past can also be interpreted as learning the importance of mixing substances, as in the development of steel from iron, which depended crucially on the percentage of carbon in the recipe, and learning the importance of specific manufacturing methods, such as the temperatures and techniques used to heat and quench steel.

A 2019 article in Digitaltrends lists potential applications of graphene in flexible electronics, solar cells/photovoltaics, semiconductors, water filtration, and mosquito defense, a list that makes graphene seem poised to solve several of the problems standing between our world and its sustainability goals. As with any new substance, we need to worry about potential safety hazards, and the answer to the question “is graphene safe?” seems to be that we don’t know yet if it is safe in all formulations and uses.

You are likely to see new products that incorporate graphene, such as the Hall effect sensors I described above (indeed, you may have already), but you are also likely to be unaware of the presence of graphene in those products. The history of technology is the discovery of new materials and then the painstaking and slow development of applications. I am eager to see what the future holds for graphene. Also, with graphite, diamonds, fullerenes, carbon nanotubes, and now graphene, I wonder what other tricks carbon has up its sleeve.

Where can you learn more?

Graphenea has a good technical description of the properties of graphene.

Wikipedia has a good list of potential applications of graphene.

Many state fairs are cancelled or scaled back this year. The Colorado State Fair has its eye on the important stuff with the Drive Through Fair Food event.

I made an app for that

What’s new?

JD Shadel of the BBC recently reported on the company AirDev. In 2015, entrepreneur Vladimir Leytus used Bubble, a drag-and-drop tool, to create a clone of Twitter; he accomplished this task in a week with no previous knowledge of how to program. In 2020, he repeated the exercise, creating a new clone of Twitter with updated tools, called no code tools. Leytus founded AirDev to help companies use these no code tools.

What does it mean?

Every piece of software that you use on your computer required a programmer – actually a legion of programmers – who wrote, line-by-line, the detailed instructions to tell the computer exactly what to do based on the input from you, the user. The history of coding includes great progress in making that programming task easier and easier, building up from assembly language, through Fortran and similar languages, to modern languages now in demand: Python, JavaScript, Java, C#, etc.  Important concepts have been developed and applied: interpreters and compilers, typing of variables, object oriented programming, data structures, algorithms, graphic interfaces, and more. A key idea is that programming languages build on top of other programming accomplishments. For example, one step in the progress of programming was the ability to call functions even as simple as print – print(‘Hello, world!’) – instead of having to tell the computer step by step how to print. To print the document I am writing now I can click on an icon labelled print, which will start a cascade of calls to other functions and capabilities, about which I need to know nothing. I can print! Just as anyone can print, as coding languages have improved, anyone can code, or so this article would have you believe.

I have a troubled relationship with coding and I probably should get over it. To me, “girls who code” (“we’re building the world’s largest pipeline of future female engineers”) smacks of “girls who make coffee.” During my long career as a woman in a man’s field, I have avoided taking on tasks that someone may be trying to use to demean me. Hard work on unpleasant tasks is fine with me, as long as we are all doing it, but don’t single me out to make the copies or make the coffee.

But coding has changed – a lot – since I learned Fortran in 1965 on an IBM 360. A task that used to be merely the tedious implementation of an engineer’s vision and design now may be integral to the design process itself. E. M. Forster’s statement “how can I know what I think until I see what I write” now translates to “how can I know what I design until I see what I code.” Certainly in mechatronic devices (mechanical plus electronics gives you mechatronics) the logic of the control part of the device is often the most difficult part of the design. Being able to make prototype programs and test them out quickly is the central part of the process of design in some products.

The ability to code may be the key that enables you to be the lead person on a new project. I am reminded of my father’s story about getting an assignment involving a trip to Geneva because he was the only one in the room with an up-to-date passport. The ability to code has become the ability to travel far and fast. 

My other issue with coding is that many people hear “technology” and think only of computer technology when there is so much more to the word; engineering and technology are about how the world works and must be solidly based in physics, not just computers. I perceive the word “engineer” as having been stretched in the phrase “computer engineer” to include people who do not have that fundamental knowledge of how things work (the BS in Computer Science from the Viterbi School of Engineering at the University of Southern California does not require any physics courses). You are not an engineer because you wrote a new app.

Thank you, I do feel better now.

But, despite my issues, no code coding is very cool. Opening up the ability to create an app (application) without needing to know how to code is a game changer. As the article says, “… early no-code adopters saw a more radical future in which anyone could make their own apps, and a movement that could redefine what it means to be a developer and diversify tech entrepreneurship.” Someone with an idea for a new app can create a prototype and test it out with ease and without spending a lot of money and time for the coding.

But here is another important quote: “Leytus compares the no-code trend to the emergence of PowerPoint, which mostly eliminated the need for in-house presentation designers since everyone could design their own.” I probably don’t need to spell out to you that the emergence of PowerPoint allowed some people to design perfectly awful presentations. I – and probably you too – wish that some people had had to work with a designer. Do an Internet search for “death by PowerPoint” to find examples. Probably similar arguments can be made about WordPress for webpage design (guilty, guilty, guilty) and other tools that enable people to do work that we used to have to pay for.

What does it mean for you?

I’ve written before on the enthusiastic and helpful response that led so many people to make face masks for COVID-19 protection, but I cautioned about the need to respect expertise. Arguments are still underway about the best type of fabric, how many layers, and what construction should be used for a simple face mask. Caution is in order here also. You shouldn’t – and wouldn’t – let your niece code your computer security system using a no code system just as you wouldn’t let her set up a Ring system from Walmart to provide security for your warehouse. The no code movement doesn’t eliminate your need to use experts where expertise is crucial.

The no code movement empowers the individual entrepreneur. The ability to take your new idea and code it into a working prototype, on your own, in a reasonable amount of time, and at a reasonable cost opens possibilities for entrepreneurs everywhere, or at least those with access to the required computer technology and Internet connection. But the fact that Leytus, who coded Twitter twice in no code tools, founded a company to help you use no code tools certainly brings some irony to this vision, suggesting that expertise still helps. The best argument for no code tools, for me, is that they eliminate the need to translate what I am thinking for someone else. I’m back to “how can I know what I design until I see what I code.” An entrepreneur can develop the idea while coding, instead of having to try to explain a complete idea to the person paid to do the code. Tinkering is an important part of invention.

I am excited by the use of the tools in social movements, nonprofits, and community organizations. Many communities are struggling to match food opportunities with food needs, housing opportunities with housing needs, and so forth. While many will have said “there should be an app for that” now more people can say “I’ll write the app for that.”

Where can you learn more?

Wired has more information on how the no code movement reduces start up costs for entrepreneurs. KissFlow argues for using no code tools in the IT department of an organization because it speeds up coding even for experienced programmers, and KissFlow will help you with tools.  But Bob Reselman makes a good case for expertise instead of no code platforms.

The no code movement is new for me, so I used an Internet search to find some lists of no code platforms. Webflow offers their own product but also this list. Same for budibase, which has a product and a list. Cenario doesn’t have a no code product and their list is the longest I found. The folks at App Development Cost made a list of the “top 12 native, open-source, hybrid, and rapid mobile app development tools.” They also offer a tool to help you estimate the cost of developing an app.

Different no code tools allow different types of functionality; you are limited to the capabilities programmed into the platform you choose. The best tool to create a new game will not be the best tool for a new business app. BettyBlocks claims their integration with programming languages gives the best of the coding and no code worlds. They also are clear about the types of applications that can be built on BettyBlocks.

Some argue that no code is still coding, just with a much friendlier interface. Others argue that even using a microwave requires a form of programming. And still others argue that so much of what programmers do is routine and that no code tools make all that work much easier. You can see some recent debate on this page at Quora.


This work is licensed under a Creative Commons Attribution 4.0 International License.

August 2020 meeting of Pueblo Makes

At the 18 August 2020 meeting, we heard from four makers about making with metal.

Catie Blickhahn, of Elysian Evrimata (@elysianevrimata), showed us jewelry she makes with lost wax casting, resin casting, stone cutting, and other methods.

She sells her work at Steel City Art Works Gallery (216 South Union). She has been experimenting with making videos of the making process in order to attract buyers. For her honors minor at CSU-Pueblo, she is studying how artists can market themselves in an ever-changing landscape. She is also doing a marketing minor.

Ryan Gardner, of Ryan Gardner Designs (@ryangardnerdesigns), is also a co-owner of the Center for Metal Arts (625 South Union Avenue). The center helps people at all levels of skill, from those who have never done metal work through professionals. They are planning a soft opening in November for a gallery in the front part of the building.

He showed us his own bench (above) and the large collection of shared equipment for lapidary and metal working. They teach classes and workshops in many different techniques. They just added an AV system that will allow for live streaming and online videos as well. They do have open studio times when people learning to make jewelry can have access to all equipment. They can also set up one-on-one sessions. Their goals are to “have fun and share what we do.” Four artists are in the studio full time and Ryan showed some of his work.

The above piece is optical quartz, carved on the back and inlaid with gold leaf; the stone is amethyst and the circle is oxidized sterling silver. Michael Boyd, one of the other owners, is known for his stone work. Ryan said that classes are offered on various schedules, including some in the evening, although COVID has meant that classes have been postponed. Most classes are appropriate for beginners; classes and workshops that require more knowledge are clearly labeled.

Jeff Madeen, who owns BloBack Gallery (131 Spring Street), makes art work from metal and other materials, including found objects.

The above photograph shows a natural casting from a forest fire.

Jeff just finished the above piece after 9 years. Several shows of the work of other artists are ongoing at BloBack Gallery. He also showed us larger pieces that are on the sidewalk outside of the gallery, including a piece titled Pueblo DNA and an 8-inch howitzer, cut and welded into a peace symbol. He said he can work from a set design (using Vectorworks on the computer) but finds that boring. “I’d rather not know where I’m going to end up.” He is mostly at BloBack Gallery 10 to 5 every day but Monday, but sometimes runs errands.

Ryan McWilliams owns Johnny’s Metal Works and Boiler Shop (303 South Santa Fe). They have made some objects at Watertower Place using repurposed materials, such as the chain on the staircase below.

Johnny’s specializes in doing the hard things that others can’t do. They have a wide variety of equipment, such as a big press brake, plate rollers, big shears, a water jet (photo below), and automated saws.

They do tasks that are out of the norm, anything you can think of in heavy iron. They have a lot of customers with two pieces that are supposed to be one piece. They recently did work for a multi-billion-dollar company, also recently repaired a titanium wheelchair, and are currently doing a project for a group of artists in Colorado Springs.

Cathy Valenzuela, who owns Tuxedo Ranch (custom promotional material), noted that Johnny’s also makes artistic things, such as the steel model of the light tower made for the Urban Authority to commemorate the expansion of the convention center (see photo below).

Dave Pump gave a shout out to Johnny’s for help in putting together the kinetic sculpture at Project Inspire, the new venture of Pueblo Diversified Industries.

Paula Robben offered the services of the SBDC to all the makers, and some had already used them or planned to.

After the presentation we heard updates from others in the group. Emily Gradisar announced that Ticktock Pueblo (@ticktockpueblo) is moving to a new, bigger location, next door to Bistoro at Central Plaza, and will offer more and larger work and maker spaces, with five or six spaces on the ground floor and more downstairs. There will be no commission on sales, only a flat rent.

Amanda Corum (Executive Director, Pueblo Corporate College, 719-549-3163) said activity has been slow at PCC due to COVID. She announced that a state training program is again available. She and her staff can help with filling out the application to get funds to train existing and new staff; the training can be done by the community college, by a third party, or as internal training.

Kayci Barnett, Giodone Branch Manager, said they will expand hours next month, allowing for longer computer sessions. They are continuing curbside pickup and crafts to go for the kids. Sharon Rice, makerspace librarian at Rawlings, said they have had some requests for 3D printing, but the space is closed now because furniture is stored there. In September there will be kits available for pickup.

Paula suggested making a directory of Pueblo Makers and good discussion followed. Many makers want to help others. Jane will work with Zach on a way to add such listings at

Pueblo Makes meets the third Tuesday of each month by zoom, 3:30-5 pm. The next meeting will be September 15. The link is always Please send your comments and suggestions about Pueblo Makes to

Testing, 1, 2, 3.

Analog Multimeter

What’s new?

In the 22 July 2020 issue of New Scientist, Anna Demming reports on uses of analog (British: analogue) computing in an article titled “Why old school technology could shape the future of digital computing.”

What does it mean?

If you want to find the shortest route from Los Angeles to Boston, you can program a computer to use the simplex algorithm to solve the linear programming formulation of the problem; the data are the distances between all possible intermediary cities. Or you can, as proposed in 1957 by George Minty (Operations Research 5, page 724),

“Build a string model of the travel network, where knots represent cities and string lengths represent distances (or costs). Seize the knot ‘Los Angeles’ in your left hand and the knot ‘Boston’ in your right and pull them apart. If the model becomes entangled, have an assistant untie and re-tie knots until the entanglement is resolved. Eventually one or more paths will stretch tight – they then are alternative shortest routes.”

The first method of solution is described as “digital,” because the computer computes the solution using numbers, or digits. The second is described as “analog” because it uses a physical analogy to find the solution.
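As a sketch of the digital approach, here is a short Python version that uses Dijkstra’s shortest path algorithm (a standard method for this problem, though not the simplex method mentioned above); the cities and distances are invented for illustration, not real mileages:

```python
import heapq

def shortest_path(graph, start, goal):
    """Dijkstra's algorithm: the 'digital' counterpart of Minty's string model."""
    # Priority queue of (distance so far, city, path taken)
    queue = [(0, start, [start])]
    visited = set()
    while queue:
        dist, city, path = heapq.heappop(queue)
        if city == goal:
            return dist, path
        if city in visited:
            continue
        visited.add(city)
        for neighbor, leg in graph.get(city, {}).items():
            if neighbor not in visited:
                heapq.heappush(queue, (dist + leg, neighbor, path + [neighbor]))
    return None  # no route exists

# Invented distances between a few cities
graph = {
    "Los Angeles": {"Denver": 1000, "Dallas": 1450},
    "Denver": {"Chicago": 1000, "Dallas": 800},
    "Dallas": {"Chicago": 950},
    "Chicago": {"Boston": 1000},
}
print(shortest_path(graph, "Los Angeles", "Boston"))
# (3000, ['Los Angeles', 'Denver', 'Chicago', 'Boston'])
```

The same function answers the Paris to Moscow question given a different graph; no new program is needed, which is exactly the generality the string model lacks.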

In engineering, we use models – mathematical and physical – to describe the world. We analyze those models in order to elicit the implications of the models and then we interpret those results back to the real world system. Analysis has increasingly taken the form of computation using digital computers, starting with the abacus and continuing through supercomputers.

In my field of operations research, researchers have developed powerful methods to optimize systems, that is, to find a solution that minimizes cost or maximizes benefit. Scheduling, production planning, and network design, as examples, have all improved from our ability to express these situations as mathematical models and use computers to find the optimal (or at least a very good) solution.

In many fields of engineering, the models take the form of differential equations, that is, equations written in the language of calculus in which the change in one quantity depends on the level of another quantity. For example, the rate of movement of a mass attached at the end of a spring is a function of the distance the spring is extended or compressed. The rate at which tea cools depends on the difference between the temperature of the tea and the temperature of its surroundings. The essence of engineering is the application of differential equations.

One strength of digital computers is that they are general purpose machines that can be programmed to solve many different problems. The string-and-knot model used to solve the Los Angeles to Boston problem has to be rebuilt to solve the Paris to Moscow problem and can solve only shortest path problems, while the digital computer, using the same program and algorithm, can solve the problem for any two cities, given the data of distances between intermediary cities. And that digital computer can be programmed to solve other problems.

But digital computing has drawbacks. Digital computing uses numbers, always truncated. Digital computation is an approximation because computers use 0 or 1 – and no numbers between 0 and 1 – to represent all numbers. You can approximate the numerical quantity pi as closely as you want in a computer, but it is still an approximation because pi is an infinite decimal that never repeats; digital computation truncates pi at some number of digits and numbers in digital computers are always truncated from the true number in the real world. The sound wave that varies continuously to produce the sensation of music in your ear and your head is represented very closely – but not exactly – by the digits in digital audio. The speedometer showing your car’s speed on a dial is a continuous (analog) representation of speed while the digital readout on newer cars is an approximation.
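You can see this truncation in a couple of lines of Python (the behavior is the same in any language that uses standard binary floating point):

```python
import math

# 0.1 and 0.2 have no exact binary representation, so their sum
# is not exactly 0.3 -- a tiny truncation error appears.
print(0.1 + 0.2)         # 0.30000000000000004
print(0.1 + 0.2 == 0.3)  # False

# pi stored as a 64-bit float is cut off after about 16 significant digits
print(math.pi)           # 3.141592653589793
```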

Differential equations are continuous but are represented as difference equations in the computer; that is, time varies in discrete jumps, not continuously. With an appropriate computer, the loss in accuracy is negligible, but always there. Also, a digital computer, unless designed with multiple processors to allow parallel computation, does one computation at a time.
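Here is a small Python sketch of that idea, using the cooling tea example from above: the differential equation dT/dt = -k(T - T_ambient) becomes a difference equation stepped forward in discrete jumps (the temperatures and the constant k are made up for illustration):

```python
import math

def cool(T0, T_ambient, k, dt, t_end):
    """Euler's method: replace the differential equation with a difference equation."""
    T, t = T0, 0.0
    while t < t_end:
        T += -k * (T - T_ambient) * dt  # one discrete jump in time
        t += dt
    return T

# Tea at 90 C in a 20 C room, k = 0.1 per minute, watched for 30 minutes
coarse = cool(90.0, 20.0, 0.1, dt=1.0, t_end=30.0)    # big time steps
fine = cool(90.0, 20.0, 0.1, dt=0.001, t_end=30.0)    # small time steps
exact = 20.0 + 70.0 * math.exp(-0.1 * 30.0)           # closed-form solution
print(coarse, fine, exact)
```

With smaller steps the loss in accuracy shrinks, but, as noted above, it is always there.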

But since many engineering models use differential equations, and since some differential equations recur again and again in physical models, some analog computers can be general purpose. An analog computer mimics the physical system it is modeling. As a result, unlike the digital computer that must laboriously compute the trajectory at each tiny step in time, an analog computer continuously follows the path that is the analogy of the physical system being modeled. As stated in the New Scientist article: “Quantities like electric current, charge and capacitance are related by rates of change in their values. This means they fit differential equations, allowing electrical circuits to serve as analogues for all other systems governed by such mathematical expressions.” Computing with beams of light or radiation at other wavelengths can simulate other systems of interest, such as earthquakes and stock market behavior. Other devices called memristors (resistors with memory) can simulate brain activity. Analog computing has problems of course, including laborious programming and lack of accuracy (for reasons different from digital computing).

What does it mean for you?

Maybe not much. If you are an audiophile, you may have a strong opinion about analog or digital music. My father, a telephony engineer, hated to talk on a cell phone because of its awful sound fidelity. Some photographers prefer using film over digital cameras. I know that a car’s digital display of 38.1 mph or even 38 mph is actually enough information for me as a driver, but I find it easier to quickly read and interpret an analog dial display of speed. And I learned to tell time on a clock face, not a digital display. I can more easily tell if I am speeding or if I am late for an appointment by glancing at an analog display rather than a digital display. And I am amused by the digital simulation of an analog clock face.

Computation has been so successful in improving our lives that we sometimes forget that nature does not compute. A tennis ball does not compute the arc to follow after it leaves the racket; a swallow does not fly by computing the necessary muscle movements; your brain does not think by computing with numbers.

It may make a difference to you in the future if analog or hybrid computation, because of its inherently parallel nature, makes feasible the rapid solution of problems now intractable on digital computers. At a minimum, the use of analog devices in some widespread applications may slow the pace of increased energy consumption by our many devices.

Where can you learn more?

For keeping up on trends in science and engineering, no magazine beats New Scientist. I have been reading this British magazine since 1970 when I spent part of my junior year at the University of Glasgow in Scotland.

An article at IEEE Spectrum by one of the researchers in this area, Yannis Tsividis, has more description of analog computing. Other useful articles are by Bernd Ulmann, Bill Shweber (the slide rule, which engineers used to put a human on the moon, is an analog device), and Lou Frenzel. Ulmann has a fascinating blog devoted to analog computing.

Beliefs and actions

What’s new?

Researchers at NIST have developed synthetic COVID-19 material which can be made with specific concentrations of the virus’s genetic material and which can be safely handled. This material can be used to determine the sensitivity of tests for COVID-19: how well does the test detect the presence of COVID-19 at different concentrations?

What does it mean?

For many years I taught university courses on probability and decision analysis. The mathematics behind COVID-19 testing is, of course, Bayes’s theorem, and that piece of mathematics is an insightful way of looking at medical diagnosis and public health decision making. Here is an example.

If a physician thinks a patient has a particular disease and gives a test to the patient, three numbers are necessary to determine the meaning of the test outcome for that patient. Let’s assume that, based on observations of the patient’s symptoms and before having the test administered, the physician thinks there is a 75% chance the patient has the disease; that is the first number. The test itself is described by two other numbers. The sensitivity of a test is the probability the test gives a positive result (that is, says the patient does have the disease) when the patient actually has the disease; we want that number to be high and let’s assume our test has a sensitivity of 95%. Specificity is the probability the test gives a negative result when the patient actually does not have the disease; we want that number to be high and let’s assume our test has a specificity of 99%.

Think of sensitivity and specificity by considering a person you suspect to have green/red color blindness. How well that person detects the colors is described by the person’s ability to say “green” when presented with an object that is actually green and the person’s ability to say “red” when presented with an object that is actually red. How well does the person detect the real color? How well does the test detect the true health status of the patient? Note that a person who always says “green” (whether shown a green or red object) and a medical test that always says “disease” (whether the disease is actually present or not) has perfect sensitivity (100%) but perfectly awful specificity (0%). A good test has high numbers for both sensitivity and specificity.

In my example, I write three statements shown in the three bullets below, using the notation “disease” (in quote marks) to indicate that the test says the patient has the disease. Note that the event “disease” (the test says the patient has the disease) is not the same as the event disease (the patient actually has the disease).

  • P(disease) = 0.75, which implies that P(no disease) = 0.25
  • Sensitivity: P(“disease”|disease) = 0.95, which implies that P(“no disease”|disease) = 0.05. The first notation is read as the probability that the test says the patient has the disease (“disease”), given that the patient does actually have the disease (disease).
  • Specificity: P(“no disease”|no disease) = 0.99, which implies that P(“disease”|no disease) = 0.01

The calculation using Bayes’s theorem is:
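In the notation of the bullets above, with the assumed numbers:

```latex
P(\text{disease} \mid \text{``disease''})
  = \frac{P(\text{``disease''} \mid \text{disease})\, P(\text{disease})}
         {P(\text{``disease''} \mid \text{disease})\, P(\text{disease})
            + P(\text{``disease''} \mid \text{no disease})\, P(\text{no disease})}
  = \frac{0.95 \times 0.75}{0.95 \times 0.75 + 0.01 \times 0.25}
  = \frac{0.7125}{0.7150} \approx 0.997
```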

Thus, if the test says the patient has the disease, the physician is now almost sure (99.7%) that the patient has the disease. A similar calculation shows that if the test says the patient does not have the disease, the physician puts the chance that the patient does not have the disease at about 86.8%, that is, (0.99 × 0.25)/(0.99 × 0.25 + 0.05 × 0.75); the false negatives among the many patients who truly have the disease keep that number from being higher.

If 10,000 patients like this patient take the test, the results, on average, are:

                 “Disease”   “No disease”   Totals
Disease             7125          375         7500
No disease            25         2475         2500
Totals              7150         2850       10,000

Of the 10,000, 7125 will be correctly identified as having the disease, while 25 will receive a false positive, that is, they will be told they have the disease when they actually don’t. Another 2475 will be correctly told they do not have the disease, but 375 will receive a false negative, that is, they will be told they don’t have the disease when they actually do have it.
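The arithmetic above can be checked with a short script. This is a sketch in Python using the numbers from the example; the function name is my own:

```python
# Bayes's theorem for a single diagnostic test, using the numbers from
# the example: prior 0.75, sensitivity 0.95, specificity 0.99.

def posterior_given_positive(prior, sensitivity, specificity):
    """P(disease | test says "disease")."""
    true_positive = sensitivity * prior               # P("disease" and disease)
    false_positive = (1 - specificity) * (1 - prior)  # P("disease" and no disease)
    return true_positive / (true_positive + false_positive)

print(round(posterior_given_positive(0.75, 0.95, 0.99), 4))  # 0.9965, about 99.7%

# The expected counts for 10,000 such patients match the table above.
n = 10_000
with_disease = n * 0.75                  # 7500 patients actually have the disease
print(round(with_disease * 0.95))        # 7125 true positives
print(round(with_disease * 0.05))        # 375 false negatives
print(round((n - with_disease) * 0.99))  # 2475 true negatives
print(round((n - with_disease) * 0.01))  # 25 false positives
```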

I love math. I want to show you many more examples (with graphs!) because Bayes’s theorem is an amazing piece of math. But more than my love of the math itself, I love the usefulness of math, so I will move on to the useful implications.

I have used Bayes’s theorem to model the thinking of immunohematologists (how do the results of blood tests revise beliefs concerning the antibodies present in a patient’s blood), geologists (how do test results revise beliefs concerning the likelihood of seismic activity in an area proposed for storage of nuclear waste), and insurance agent managers (how does the behavior of insurance agents revise beliefs concerning their loyalty to the insurance company).

However beautiful and useful the math in revising beliefs, the most important perspective to remember is that beliefs drive actions. What blood should the immunohematologist recommend for transfusion into the patient, where should nuclear waste be stored, and what action should a company take concerning an agent’s disloyalty?

With COVID-19 tests, action is complicated. Because of the lag in getting test results in many parts of the country and because of poor or unknown specificity and sensitivity of COVID-19 test results, a person cannot use the results to guide immediate action, nor can contact tracers use the results to know whose contacts to trace, nor can policy makers use the results to decide what public health measures should be taken.

What does it mean for you?

Bayes’s theorem is a way to think about revising one’s beliefs. One starts with prior beliefs, that is, initial beliefs about the presence of some factor (the disease or the market attractiveness of a new product) based on current information. Then, in the face of new information (a medical test or a test of the new product in one market), one needs to revise one’s beliefs.

That revision should be based on the ability of the test to detect the factor: if the factor really is present, can the test detect it, and if the factor really is not present, can the test detect that? The test is evaluated by its ability to detect reality. Is the test market a good indication of whether the product will be successful in other markets? I used to live in Columbus, Ohio, which, I was told, was an excellent test market for fast food items: if a product was a success there, it would succeed nationally, and if it failed there, it would fail nationally. We want a test with 100% sensitivity and 100% specificity but, unfortunately, for some diseases (for example, Alzheimer’s disease), the only perfect test is an autopsy.
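The test-market revision works exactly like the medical one. Here is a sketch with invented numbers: the prior and the test market’s “sensitivity” and “specificity” are assumptions for illustration only.

```python
# Hypothetical product-launch example: revise the belief that a product
# will succeed nationally after a successful test market. All numbers
# here are invented for illustration.

def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Posterior probability of the hypothesis after seeing the evidence."""
    numerator = p_evidence_if_true * prior
    denominator = numerator + p_evidence_if_false * (1 - prior)
    return numerator / denominator

prior_success = 0.40          # initial belief the product succeeds nationally
p_test_win_if_success = 0.90  # the test market's "sensitivity"
p_test_win_if_failure = 0.20  # 1 minus the test market's "specificity"

posterior = bayes_update(prior_success, p_test_win_if_success, p_test_win_if_failure)
print(round(posterior, 3))  # 0.75: the test-market win raises 40% belief to 75%
```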

Whenever there is uncertainty (that is, all the time), you won’t always be right. Any test has false positives and false negatives. That means the quality of a decision can only be judged by whether it was a good decision at the time it was made, not by its outcomes.

Beliefs drive actions, but a decision about actions must also consider the possible consequences of actions. With a different set of three numbers, the Bayes’s theorem calculation may leave a physician unsure whether or not the patient has a disease, but the physician may choose to administer a treatment anyway if early treatment is known to have an excellent outcome while lack of treatment often results in death.
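The point about consequences can be made concrete with a small expected-utility calculation. All probabilities and utility values below are invented for illustration:

```python
# Treat-or-wait under an uncertain diagnosis. All probabilities and
# utilities (1.0 = best outcome, 0.0 = worst) are invented for illustration.

p_disease = 0.50  # the physician is genuinely unsure after the test

def expected_utility(action, p_disease):
    if action == "treat":
        # Early treatment works well if diseased; small cost if healthy.
        return p_disease * 0.95 + (1 - p_disease) * 0.90
    # Untreated disease often ends badly; healthy patients are fine.
    return p_disease * 0.30 + (1 - p_disease) * 1.00

for action in ("treat", "wait"):
    print(action, round(expected_utility(action, p_disease), 3))
# Treating scores 0.925 versus 0.65 for waiting, so the physician treats
# even though the diagnosis is uncertain.
```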

One action leads to a series of actions. In decision analysis, we use a decision tree to represent a series of chance events and decision points into the future. A physician may try a treatment and then, depending on the patient’s results, continue that treatment or try a different one. Taking one action now may leave available later actions as options; a different action now may close off other later actions.
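A decision tree is evaluated by “folding back”: start at the last decision points, keep the best option at each one, and average over chance events to value the earlier choices. A minimal sketch, with an invented two-stage treatment tree:

```python
# Folding back a tiny two-stage treatment tree. All probabilities and
# outcome values (0.9 = recovery, 0.2 = no recovery) are invented.

RECOVER, NO_RECOVER = 0.9, 0.2  # outcome values at the leaves

def best_followup():
    """Stage 2: if treatment A fails, keep the better follow-up option."""
    switch_to_b = 0.60 * RECOVER + 0.40 * NO_RECOVER  # B works 60% of the time
    continue_a = 0.30 * RECOVER + 0.70 * NO_RECOVER   # staying with A rarely works
    return max(switch_to_b, continue_a)

# Stage 1: treatment A succeeds with probability 0.7; otherwise the
# physician faces the stage-2 decision, whose value was folded back.
value_of_a = 0.70 * RECOVER + 0.30 * best_followup()
print(round(value_of_a, 3))  # 0.816
```

Keeping the follow-up decision open is what makes treatment A attractive: the stage-1 value depends on the best stage-2 choice, not on any single fixed plan.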

I taught an entire course (actually two entire courses) on decision analysis, but this piece is already longer than my usual writing. I would love to go on and on, but I will stop here: decision analysis is a very useful framework with many insights to offer decision makers.

Where can you learn more?

Regarding the formation of possessives, I come down on the side of Bayes’s theorem, not Bayes’, but that dispute rages on.

Several calculators for Bayes’s theorem are available: here, here, and here. This one allows for more than two hypotheses (disease and no disease) and more than two test outcomes (“disease” and “no disease”).

Bayesian methods are increasingly used in sophisticated AI methods; see this example in machine learning.

Decision analysis software can help in modeling decisions. TreeAge is my favorite.

There are many books that serve as good introductions to decision analysis: Handbook of Decision Analysis, Introduction to Decision Analysis by David C. Skinner, and Choices by Michael D. Resnik. My favorite is still the 1968 classic by Howard Raiffa.

The American Academy of Family Physicians provides this overview of COVID-19 testing. This article discusses the sensitivity and specificity of COVID-19 tests for COVID-19 antibodies (not for the COVID-19 virus).

Your electric future

Source: Wikimedia

What’s new?

Colorado’s Public Utilities Commission and other Colorado government agencies have opened discussions with the Colorado companies that provide natural gas service about future plans for large natural gas infrastructure, as described by Allen Best in Mountain Town News this week.

What does it mean?

Mr. Best explains that gas infrastructure is expensive and long-lasting. The discussions will be based on the long-term effects of large decisions, looking out 10 to 20 years. Limiting new installations of natural gas facilities will help the state meet ambitious goals to reduce emissions of greenhouse gases; a 2019 Colorado law requires the regulatory agencies to act in support of these goals.

He places this action in the context of a movement called beneficial electrification. As explained by the Environmental and Energy Study Institute: “Beneficial electrification (or strategic electrification) is a term for replacing direct fossil fuel use (e.g., propane, heating oil, gasoline) with electricity in a way that reduces overall emissions and energy costs. There are many opportunities across the residential and commercial sectors. This can include switching to an electric vehicle or an electric heating system – as long as the end-user and the environment both benefit.”

The long-term goal of moving completely away from fossil fuels to renewable energy increasingly looks like electrification, because electricity can be generated economically from renewable sources such as solar and wind. While the COVID-19 crisis has slowed the electrification trend somewhat, both the US Energy Information Administration and analysis by the financial asset management firm Lazard state that renewable energy has won the cost battle. “In other words, it is now cheaper to save the climate than to destroy it.”

I am proud to point out the huge contribution of engineers in this achievement. The trend is driven by improvements in the devices and physical systems, improvements in the manufacture of those items, and improvements in the use of those items. For example, wind power has improved with better designs for wind towers, turbines, and all their components; more efficient manufacturing of all of those items leading to lower costs; and better designs for operating electrical grids to maximize the use of the wind when it is available. Furthermore, electrified devices can incorporate sophisticated controls that optimize their operation to provide high user satisfaction and low energy use. My two favorite engineering specialties, industrial engineering and mechatronics, deserve a lot of the credit. Industrial engineering drives the improvement in manufacturing and mechatronics drives the use of controls of devices and systems.

The National Academy of Engineering cites electrification as the greatest engineering achievement of the 20th century, and the book Networks of Power: Electrification in Western Society, 1880-1930, by Thomas P. Hughes, makes the case for the huge societal changes that ensued. David Nye’s Electrifying America: Social Meanings of a New Technology emphasizes even more the social impact of this technology. The current second wave of electrification is likely to have equally large impacts.

Much work remains to be done. As skeptics point out, the sun does not always shine and the wind does not always blow, so storage technology is essential for this electrified future. The cost, range, and usefulness of electric vehicles will depend crucially on battery technology and, thank you again to the engineers, that technology is improving at a rapid pace. It is not a question of “if” but “when” electric vehicles will be cheaper than vehicles with internal combustion engines. Other technologies are approaching or have already achieved economic feasibility: utility-scale energy storage, efficient grid management, and advanced building design. My favorite not-so-crazy idea is the increasing use of direct current, rather than alternating current (yes, I think Edison was right, not Tesla).

Regulatory issues, the topic where I started this article, remain important, especially concerning the financial incentives felt by consumers of electricity.

What does it mean for you?

Consumer uses tend to dominate the discussion of beneficial electrification: home heating, water heating, electric vehicles, and so forth. Some of those technologies apply in business settings, and your offices and other buildings will increasingly rely on electricity.

Industrial uses present more of a challenge, especially processes that require large surges of power for short time periods. But if the steel mill in my town can move to solar power, other industrial processes can and will do the same. The mill has already transitioned to using only recycled steel and now is leading in the electrification trend. Electric forklifts, induction heating, and efficient heat pumps are already here.

Where can you learn more?

This March 2018 report from Lawrence Berkeley National Lab lays out the potential for electrification, with an emphasis on the industrial and commercial sectors. Saul Griffith is a powerful voice advocating for the electric future, and he is creating companies that move us there. McKinsey & Company argues that companies should be factoring electrification into all of their capital spending plans.