Conclusion.

When you want to have your cake and eat it too, you're in a dilemma. When you want to continue a high-level technology without further gutting Mother Earth for resources, you have to move your industrial base out of this world.

Industrial processes depend upon the availability of both raw materials and energy. Happily, the space environment has both. Since at first it will be far too expensive to bring raw materials up out of the terrestrial gravity well of 11.2 km/sec, we will have to obtain them from shallower gravity wells. Initially, this could well be the Moon with only a 2.4 km/sec gravity well. However, it is quite likely that the prime source of raw materials in the Solar System will be the planetoid belt, and there has been considerable discussion about this possibility in these pages and elsewhere. Later on, and in spite of a gravity well of 61.0 km/sec, the planet Jupiter will be tapped for raw materials.

Note that the figure of difficulty for transporting raw materials around the Solar System is given in terms of velocity change, not distance. And transit time has no bearing on the cost. The true cost of lobbing raw materials around in space is a function of the cost of converting chemical or solar energy into the kinetic energy of velocity. Even if we come up with a reactionless space drive, it will require converting energy. The big cost of cosmic mining operations is that of getting human beings out there and maintaining life support while they are directing operations. The raw material itself will be sent to the space factories by the cheapest possible method. This means interplanetary trajectories requiring the least amount of energy. Lob a load into the proper trajectory at regular intervals, and pretty soon you have an astronautic pipeline with loads coming into the space factory with great regularity. Once such a pipeline is established, it makes little difference how long a given load stays in transit; all that matters is the constant delivery rate.

Even delivering raw materials from the Moon is not going to be an expensive thing. The Moon is a body with no atmosphere and with an escape velocity of 2.4 km/sec. We can use solar energy to generate electricity to power an electromagnetic lunar surface catapult capable of heaving quite sizable loads into space. Even the energy losses in this device cost us little, since the solar energy driving it is free for the taking. The only cost is depreciation of the capital equipment and the expenditure of the steel cans used to house the raw material loads to permit them to be catapulted. This sort of thing should be old hat to most SF readers who have cracked the covers of Heinlein's "The Moon Is a Harsh Mistress" or any number of other stories in which a lunar catapult has appeared. We could produce the engineering drawings for such a device today.
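As a rough check on why the lunar catapult looks so cheap, consider the kinetic energy that must be imparted per kilogram of payload. The little sketch below is purely illustrative; it ignores gravity losses and machine inefficiencies, and the only inputs are the escape velocities quoted above.

```python
# Back-of-the-envelope: kinetic energy per kilogram needed to reach
# escape velocity from the Moon versus from Earth.
# Illustrative only; ignores gravity losses, drag, and machine losses.

def escape_energy_per_kg(v_escape_m_s: float) -> float:
    """Kinetic energy (joules) per kilogram at the given escape velocity."""
    return 0.5 * v_escape_m_s ** 2

moon = escape_energy_per_kg(2_400)    # ~2.9e6 J/kg  (~0.8 kWh)
earth = escape_energy_per_kg(11_200)  # ~6.3e7 J/kg  (~17.4 kWh)

print(f"Moon:  {moon/1e6:.1f} MJ/kg ({moon/3.6e6:.2f} kWh/kg)")
print(f"Earth: {earth/1e6:.1f} MJ/kg ({earth/3.6e6:.1f} kWh/kg)")
print(f"Earth/Moon energy ratio: {earth/moon:.0f}x")
```

Less than a kilowatt-hour of energy per kilogram thrown off the Moon, roughly a twentieth of what the same kilogram costs coming up from Earth, which is why only the hardware depreciation and the throwaway cans really matter.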

No, the technical problems of shipping raw material around the Solar System are not formidable and the costs are reasonable. The technical problems lie elsewhere: with us, with the human beings who must be there to run things. We must be there because it takes too long at the speed of light for radio commands and telemetry to span the distances involved; remote control becomes highly impractical and it becomes cheaper to put people there.

At the moment, the technical problems involved in designing, building, and maintaining long-life space-going closed ecological life-support systems prevent us from immediately mining the planetoids or Jupiter. But we can look forward to the solution to this within the next twenty-five years.

As we have seen in Part One, there is also plenty of energy available in the Solar System.

But there are interesting little engineering problems concerned with proper handling of energy in the space environment. We have not had to worry about these problems on the Earth's surface . . . yet. But we will have to before very many more years have passed, because the most important problem that we are facing on Earth—and will face in space—is one of heat balance.

Heat balance? Why worry about it? Tap the city mains or the local river for cooling water. Build cooling towers to dump the excess heat into the local atmosphere. Discharge the hot coolant into the river or ocean. The atmosphere and the oceans are pretty good heat sinks . . . and they are big.

Our energy-use levels have, until recently, been low enough that engineers could take the easy way out (as they always do, quite properly so, by training). They simply dumped hot gases into the atmosphere and hot water into the rivers and oceans. They probably could have done it for a much longer time if they hadn't also dumped waste matter at the same time. People began to get upset when millions of fish went belly-up and when they discovered they couldn't see across the front yard because of smog and tears in their eyes. So industrial engineers are becoming increasingly sensitive about heat balance, efficiencies, and the like.

These are going to be very critical in the space environment, but for a different reason.

There is a very, very large heat sink out there, and it will take more than we can put into it for centuries to come. But our little world, the space factory, is going to be a very small heat sink.

In a space manufacturing operation, every little bitty calorie is going to have to be watched. Every calorie coming in is going to have to be accounted for, it's going to have to be watched while it is there, and it is going to have to be disposed of on schedule. The only way to dump excess heat from a vehicle in space—other than by overboard dump of coolant, which must then be replaced—is by radiation.

Space factories using the high energy levels and densities available "out thar" will have to have lots of radiator area to discharge the inevitable waste heat into the sink of the space environment. Insulation only delays the problem of dealing with waste heat, although engineers will use insulation to provide heat "delay lines" in the space factory, perhaps to smooth out heat pulses or peak heat loads so that they can be more easily handled. Space industry heat radiators will also be much larger than required. Engineers simply refuse to eliminate a thing they call "safety factor." They will deliberately over-design and over-build by a factor ranging from two to ten, depending upon the magnitude and seriousness of the consequences if something goes wrong. This does not mean that they have no confidence in their work. It is their statement of reality because they know good and well that sooner or later (a) somebody is going to goof, forget instructions, panic, or try to stretch the design, and/or (b) something in the system is going to malfunction. They've humorously tagged it "Finagle's Law" and it states that "anything that can go wrong, will go wrong." To prevent a heat surge from turning the space factory into a white-hot glob, they will try to anticipate overloads by building the heat radiators larger than required or by making stand-bys available.
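How much radiator area are we talking about? The only handle a radiator has in vacuum is the Stefan-Boltzmann law: a surface radiates in proportion to the fourth power of its absolute temperature. The sketch below is a minimal illustration; the one-megawatt load, the 400-kelvin panel temperature, the 0.9 emissivity, and the factor-of-two margin are assumptions made for the example, not figures from the text.

```python
# Minimal radiator-sizing sketch using the Stefan-Boltzmann law.
# The assumed numbers (1 MW load, 400 K panels, emissivity 0.9, 2x margin)
# are illustrative only.

SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def radiator_area(waste_heat_w, panel_temp_k, emissivity=0.9, sides=2):
    """Panel area (m^2) needed to radiate the given waste heat to deep space."""
    flux_per_m2 = emissivity * SIGMA * panel_temp_k ** 4  # W per m^2 of surface
    return waste_heat_w / (flux_per_m2 * sides)           # both faces radiate

area = radiator_area(1.0e6, 400.0)
print(f"Ideal area: {area:.0f} m^2; with a 2x safety factor: {2*area:.0f} m^2")
```

Several hundred square meters of panel just to dump one megawatt at a modest 400 K is exactly the "very large structures" the text is talking about; run the panels hotter and the required area shrinks with the fourth power of the temperature.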

Yes, we are talking about very large structures and very large heat fluxes. But engineers are not boggled very often or for very long by big numbers. They are even more blasé about big numbers than astronomers. And they are working in a strain-free environment.

They'll handle the heat balance because they will have to. Even now, a number of industrial processes are being considered for space, and they require heat sources.

In space, there are three basic sources of heat energy:

1. Solar radiation

2. Nuclear reactors

3. Chemical reactions

Right off the bat, chemical reactions might seem to be ruled out. But this is because our minds are still earthbound and thinking only about fuel-oxidizer exothermic reactions. Any sharp chemical engineer will advise you that there are other ways to get chemical heat. For example, be very careful not to burn yourself when you make up a solution of zinc chloride and water; the solution gets pretty hot. Fuel cells may also be considered in the category of chemical reactors, and these are already being used in space exploration. The big problem with chemical reaction heat sources seems to be the amount of matter required and the fact that there is waste matter left over . . . water in the case of oxy-hydrogen fuel cells. However, it might be possible to recycle the water from fuel cell operation by using solar radiation to split the water molecule back into hydrogen and oxygen.

It would also seem obvious at first glance that the chief contender for heat energy would be solar radiation. After all, it's out there for the taking. Or is it? No, not always: a space factory located in a 200-kilometer equatorial earth orbit ducks into the earth's shadow for a significant portion of its orbital period. Same problem as using solar radiation as a sole power source here on earth, except the day-night cycle is shorter in the 200-kilometer orbit. The answer, of course, is to put the space factory into a polar orbit that rides the day-night terminator; it is only incidental that more energy is required to put it there and to get to it from the surface and from deep space. Ruzic cryostats and factories on Luna are going to have to have an energy source to substitute for solar radiation during the fourteen-day lunar night. Obviously, there are earth orbits into which a space factory can be placed where there is constant solar radiation; these are far out and at a slight angle to the ecliptic so that they get neither terrestrial nor lunar shadow. Thus, except for close-in orbits, solar radiation remains as probably the best energy source . . . even though you have to go to the bother of keeping your solar radiation receivers pointed toward Sol.
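To see just how much sunlight that low equatorial orbit loses, a cylindrical-shadow approximation is good enough. The sketch below assumes a circular orbit with the Sun lying in the orbital plane (the worst case); the 200-kilometer altitude is the one quoted above, and the Earth radius and gravitational parameter are standard values rather than anything from the article.

```python
# Worst-case eclipse fraction for a circular orbit, cylindrical-shadow model.
# Assumes the Sun lies in the orbital plane; standard Earth constants.
import math

R_EARTH = 6371.0        # km
MU_EARTH = 398_600.0    # km^3/s^2

def eclipse_fraction(altitude_km: float) -> float:
    """Fraction of each orbit spent inside Earth's shadow (worst case)."""
    r = R_EARTH + altitude_km
    return math.asin(R_EARTH / r) / math.pi

def orbital_period_min(altitude_km: float) -> float:
    r = R_EARTH + altitude_km
    return 2 * math.pi * math.sqrt(r ** 3 / MU_EARTH) / 60.0

frac = eclipse_fraction(200.0)
period = orbital_period_min(200.0)
print(f"Eclipse fraction: {frac:.0%}, i.e. about {frac * period:.0f} "
      f"minutes of every {period:.0f}-minute orbit in shadow")
```

Roughly forty percent of every orbit spent in the dark is a poor showing for a "free" power source, which is why the text points toward terminator-riding or far-out orbits that never see a shadow.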

On-board nuclear energy sources are small, compact, constant, and reliable. They are also damnably inefficient at this point in time with only faint hopes of improving the situation in the foreseeable future. However, because nuclear units have such excellent characteristics, engineers will apply their philosophy of "don't fight it, love it." If nuclear reactors produce lots of heat directly, use it directly instead of trying to convert it all into electricity. Yes, nuclear reactors do also produce ionizing radiation that isn't exactly healthy to living organisms in large doses. But reactors have been powering submarines for well over a decade. Submariners have been living with nuclear reactors practically in their laps. It does not appear that the birth rate in New London, Connecticut has fallen in the slightest. Nor do the girls put away the pills when the U.S.S. Enterprise makes port.

Each of these three energy sources has its own advantages and disadvantages. We will be using all three of them in our Third Industrial Revolution space factories, each in specialized applications where their good points come out tops in the trade-off.

At the moment, the most efficient means for utilizing the output of any of these three sources appears to be in the form of electricity. Electricity is easily generated by all three sources. Electricity is easy to handle, transmit, control, measure, switch, and use.

NASA has studied several promising electrical heat sources for space industrial processes. These are:

1. Induction heating

2. Electron beam gun heating

3. Electron beam plasma gun heating

4. Laser

5. Electric arc

6. Electrical resistance heating

7. Ultrasonics

8. Microwave heating

Some of these can be used in a vacuum—and, in fact, must have a vacuum in order to work—while others require an atmosphere. Electric arc heating is one of those that will not work in a vacuum and must be supplied with an atmosphere in order to strike the arc.

You might think that a laser would be an excellent industrial heat source for space. Perhaps for some very specialized applications. But not generally. According to unclassified sources, the best efficiency attained thus far in high-power CO2 lasers is a roaring fourteen percent—which means that you've got to handle eighty-six percent waste heat somehow.
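Put another way, every watt that actually arrives at the workpiece drags a multiple of itself along as waste heat back at the laser. For a conversion efficiency $\eta$, the bookkeeping is simply

$$ \frac{P_{\text{waste}}}{P_{\text{delivered}}} = \frac{1-\eta}{\eta}, \qquad \eta = 0.14 \;\Rightarrow\; \frac{0.86}{0.14} \approx 6.1 . $$

Six watts of radiator load for every watt of useful beam, before the workpiece re-radiates anything, is a stiff price in an environment where every calorie must be budgeted.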

Heating by ultrasonics also falls into this category: poor efficiency. In addition, this process requires friction, the actual rubbing of materials together. In the zero-g environment, this could well be a serious drawback.

Microwave radiation heating has poor heat efficiency and poor weight efficiency. In addition, it's molecularly sensitive. Microwave ovens work because they excite the water molecule in foods . . . and other organic materials such as your hand if that happens to get in the beam. Yes, if you get enough beam power, a high enough energy density, you can heat lots of things . . . like vaporizing a ball of steel wool thrown into the beam of a megawatt missile tracking radar.

FIGURE 1. Electromagnetic "levitation" and heating of metals in zero-g can be accomplished by application of radio frequencies to coils surrounding the sample. Heating is accomplished by induction, and positioning and handling are carried out by varying the r-f in the coils.

The electron beam gun turns out to be very good as a space industrial heating source in a vacuum. It has very good heat efficiency and high energy density. It can be focused down to get things very hot quite locally.

But the electron beam plasma gun has no definite advantages over the simpler, ordinary electron beam gun. It's more complicated. But it may be used in special processes where specific beam particles are required.

Induction heating comes in with a score of seventy percent efficiency and, if the material is magnetic, can be used to levitate, position, and move the material as well as heat or melt it. It is also possible to get very high temperature gradients with induction heating. (See Figure 1.)
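The same waste-heat bookkeeping applied to the laser above puts the induction heater in a much better light:

$$ \frac{P_{\text{waste}}}{P_{\text{delivered}}} = \frac{1-0.70}{0.70} \approx 0.43 . $$

Less than half a watt of waste per watt delivered, against roughly six watts for the fourteen-percent laser; that difference shows up directly as radiator area.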

The very best heat source for space industrial processes in terms of thermal efficiency, weight efficiency, cleanliness, and adaptability to the space environment—especially vacuum—is electrical resistance heating. The good old electric stove turns out to be the best heating source for space industry!

Please note that none of the above heating methods involves the use of a flame created by the combustion of a fuel and an oxidizer. This is not because we would have to transport the fuel and oxidizer to the space industrial site. It is because the weightless environment does not permit the existence of a flame as we know it.

The familiar candle flame in a one-g field is shown in Figure 2. It results from burning the paraffin hydrocarbon of the candle with the oxygen in the ambient air. It is a steady-state process once it is initiated. Several events occur very rapidly in the flame region. Combustion specialists are still not exactly certain what happens when and where, and the simple flame turns out to be another common and apparently simple device, like the electrical transformer, that is very difficult to understand. It appears that the solid material of the hydrocarbon is first melted, then vaporized by the heat radiated from the combustion zone; this absorbs some of the heat energy of the system. These gases then react with the surrounding ambient oxygen, and the resulting oxidation reaction releases enough heat energy to drive the system and perpetuate this chemical chain reaction. Combustion product gases are formed by this reaction and include carbon dioxide, carbon monoxide, water vapor, and numerous minor combustion products ranging in complexity from hydrogen to large organic molecules. Specialists in flame structure and combustion processes believe that many of these compounds, including the simpler ones, are formed through various stages that involve free atom and free radical production and reaction.

FIGURES 2 and 3. Behavior of a flame is greatly different in our normal one-g environment (Figure 2) from that in the zero-g of space (Figure 3). On Earth, the heated gases from the flame are less dense, therefore rise and pull in fresh oxygen for the flame. In zero-g, there are no convection air currents formed, and the flame surrounds itself with combustion products.

The resultant low density hot combustion end-product gases rise from the flame zone since they are less dense than the surrounding atmosphere. This permits atmospheric oxygen to enter the flame zone to perpetuate the process. Although flame combustion is therefore a very complex process with many intermediate steps, it is a steady-state process in which gas flow and fuel flow into the system can be equated with gas flow out plus heat energy . . . in a one-g field.

If the candle and flame are placed in a zero-g environment, the lower density combustion end-product gases do not rise away from the flame zone. They can't. In the absence of a gravity field, less dense gases cannot rise. Therefore, fresh atmospheric oxygen cannot move into the flame zone to perpetuate the combustion process.

An idealized zero-g flame shortly after ignition is shown in Figure 3. Some studies have already been made with zero-g flames in jet aircraft flying parabolic trajectories. High speed cine films have been made of these experiments. They show some interesting and, at first, confusing results.

Initially, the flame builds up to maximum size and brilliance very quickly. Then, just as quickly, it recedes and darkens. In actuality, the zero-g flame never achieves the full envelopment shown in the idealized sketch because the ignition method localizes the burning to a few spots. In addition, the combustion products tend to subdue the flame before the fuel-wick system can be enveloped in toto.

But the zero-g flame process did not stop!

When acceleration returned at the end of the parabolic zero-g flight, the flame reappeared.

This is an example of the totally unsuspected and highly serendipitous sort of thing that we may expect in our transition from earthbound conditions to those of space.

Naturally, this strange flame behavior immediately prompts scientists to form hypotheses to explain what happened. One of these hypotheses goes as follows:

The gas that is formed from the fuel by heat energy reacts chemically with the atmospheric oxygen. But, unlike the steady-state fuel-to-air mixture ratio of a one-g flame, the zero-g flame experiences a constantly changing mixture ratio. It goes from very lean to very rich as the ambient oxygen originally surrounding it is chemically used up. The process occurs much faster than ordinary diffusion can replace the oxygen, and oxygen starvation occurs. The flame immediately begins to cool. The result is a blanket of fuel-rich flammable gas next to a molten fuel that is in turn covered by a layer of both solid and gaseous combustion products in a fuel-rich state. These multifarious combustion products probably include free radicals and free atoms, plus a lot of very fuel-rich components . . . plus heat energy that has not been able to leave the party by convection currents of the combustion gases. The heat energy stays in the system, leaving only by conduction through the unburned fuel to the candle, in our example, or by radiation from the corona of unburned components.

The zero-g flame system can best be described at this point as dormant.

When convection is renewed by either acceleration of the system or by fan-induced air movement, the gases are cleared away, oxygen is provided to the starved system, the complex free radicals and free atoms begin to react, and the combustion process is renewed.

How long will a zero-g flame remain dormant? The aircraft tests provided only enough weightless time for a maximum of twelve seconds "float" for the zero-g flame . . . which is plenty of time to study it, sample the gases and compounds within the dormant flame, and measure it six ways from Sunday. The flames remained dormant for as long as the twelve-second float.

This is not only plenty of time to study a zero-g flame with modern high speed instrumentation, it is also plenty long enough for hypothetical, yet-to-be-devised industrial processes that will tap a dormant zero-g flame to obtain some very unusual chemical compounds obtainable in no other way!

We are going to run into other amazing and serendipitous discoveries in the space environment. Obviously, because of our one-g mental orientation, we haven't suspected all the zero-g possibilities in reasonably common phenomena. Most of these discoveries will be quite elegant in their simplicity and will cause us to exclaim, "Why, of course! Why didn't we think of it before?"

The reason is simple: We have a distorted notion of the way the universe works.

But we do not need to base our speculations about what space industry can do on things that we might or might not discover to be possible once we are firmly entrenched in the space environment. We do not need to say, "We must do it because we don't know what will be discovered until we do." The people who are interested in the possibility of space industry are doing their homework in advance. They have already identified a number of industrial processes that appear to be quite viable, unique, and economical in space.

The table on page 105 shows in outline form some fourteen different generalized industrial processes amenable to the space environment. These include some forty-four different subheads which are distinct processes themselves. Synergistic combinations of these lead to an astronomical number of possibilities. This means that we are going to be able to conduct a diverse range of industrial operations in space.

Some of these have been discussed at length here and elsewhere. See, for example, Joseph Green's article on "Manufacturing in Space" in the December 1970 issue of Analog. A number of space industrial products, some obvious and some not, have also been discussed. Since it is quite impractical here to cover the remaining ones on the chart that have not received attention, let's just look at a few of the more interesting processes and see what comes from a study of their possible uses. Mind you, the obvious products probably will not be the most ubiquitous or most profitable ones to come from any given process. And there are combinations of processes that could produce some interesting products. Remember that we are standing in the shoes of a hypothetical Benjamin Franklin just after the opening gun has gone off for the First Industrial Revolution; could Ben or any other intelligent and informed person have forecast all the products that would ensue?

FIGURE 4. Marangoni Flow is a surface tension phenomenon: the surface of a fluid moves from an area of low surface tension toward an area of high surface tension. Surface tension gradients can be created by a difference in temperature or concentration.

Also please remember that we are not just talking about the "early days" of space industry between now and the turn of the century. We are looking at all the possibilities, assuming deep space transportation rates comparable to today's ocean-going shipping tariffs. We are also considering using raw materials that we are going to find in space and on other planets with shallower gravity wells.

Separation and purification of materials takes on a wholly different aspect when carried out in the space environment. For example, electrophoresis involves the use of electric field gradients to separate macromolecules. It is a very useful industrial tool here on Earth for biological production. But because of convection within the apparatus, we have not been able to widely use the simple, straightforward and highly efficient fluid electrophoresis. This has led to the development of paper electrophoresis, column electrophoresis, and electrochromatography; they are not as suitable for a number of reasons. In the space environment, the lack of convection cells created by density differences will permit the use of zone electrophoresis with liquids, which promises to produce exceedingly high quality biological materials.
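The principle itself is simple to state. In the idealized textbook picture (the expressions below are the standard ones, not something given in the article), each macromolecule drifts at a velocity set by its charge, its size, and the viscosity of the fluid:

$$ v = \mu E, \qquad \mu \approx \frac{q}{6\pi\eta r}, $$

where $E$ is the applied field, $q$ the net charge, $r$ the effective radius of the molecule, and $\eta$ the fluid viscosity. Species with different charge-to-size ratios drift apart into separate zones; on Earth, thermal convection in the liquid stirs those zones back together, and that is exactly the disturbance that vanishes in zero-g.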

The lack of convection due to density gradients in a fluid is an aspect of zero-g, by the way, that is just beginning to draw attention to itself. Surface tension and other internal forces are usually overpowered here on Earth by the presence of the one-g gravity field. In the space environment, the ascendancy of these formerly "weak" forces is a reality that can be used for industrial processing. There are two prime examples of surface tension phenomena that are of interest.

Very few people have heard of Marangoni Flow, but many people have witnessed it. Marangoni Flow is fluid flow caused by surface tension gradients. Surface tension itself is a function of both the temperature and the chemical concentration of a fluid. If a free liquid surface experiences a surface tension gradient because of temperature or concentration differences, the fluid will flow along its surface from an area of low surface tension to an area of higher surface tension. (See Figure 4.) A common illustration of Marangoni Flow is the formation of "tear drops" in a wine glass. The evaporation of the alcohol in the wine leaves a cool layer of wine on the wetted side of the glass above the liquid level of the wine. This has a higher surface tension than the wine in the glass, so the liquid layer on the wetted wall draws up liquid from the bulk until a "tear" is formed. When the tear becomes too large and gravity forces again prevail, it runs back into the bulk of the wine in the glass.
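For readers who like a formula to hang on to, the driving term is the surface tension gradient itself, which acts as a shear stress on the surface layer; the dimensionless group below is the standard way of judging how strong the effect is (these expressions are textbook fluid mechanics, not anything given in the article):

$$ \tau = \frac{d\sigma}{dx} = \frac{d\sigma}{dT}\,\frac{dT}{dx}, \qquad \mathrm{Ma} = \frac{\left|d\sigma/dT\right|\,\Delta T\,L}{\mu\,\alpha} . $$

Here $\sigma$ is the surface tension, $T$ the temperature, $L$ a characteristic length, $\mu$ the viscosity, and $\alpha$ the thermal diffusivity of the liquid. On Earth the buoyancy-driven (Rayleigh) terms usually swamp this; take gravity away and the Marangoni number is the one that matters.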

Obviously, Marangoni Flow can be used to separate fluids of different surface tension, to pump fluids because of a difference in temperature, or to produce fluid flow under vacuum conditions. It will be predominant in the zero-g environment of a space factory. I am not going to make the same mistake here that Arthur Clarke made in his famous article on communications satellites; there are some patentable aspects to the use of Marangoni Flow in space industrial processes that I would not care to discuss for a couple of years in the public press, thank you.

FIGURE 5. Rayleigh Flow is the well-known earthbound formation of convection cells by density gradients where the rising low-density fluid lifts the liquid surface. But in zero-g the only convective heating cells that can form will be the result of Benard Flow, a surface tension phenomenon where liquid flows from areas of low surface tension to areas of high surface tension and flows faster on the surface than within the fluid. Thus, Benard Flow lifts the fluid surface above the cold portions of the convection cell because surface tension increases with a decrease in temperature. It doesn't look right, but that's our one-g mental distortion at work!

Benard Flow is a variation of Marangoni Flow that is also a surface tension gradient phenomenon. It will produce convection cells in a film of liquid in the absence of a gravity field. Or even against a gravity field! If you have a thin layer of fluid and heat it on one side, cooling it on the other, you will establish cellular convective motion within the fluid independent of ordinary Rayleigh convection cells. Figure 5 shows the difference. Again, the fluid flows from an area of low surface tension to one of high surface tension, except that it occurs not on the surface of the fluid but within a thin film of fluid.

Don't get mentally blocked or problem-set with the idea that a fluid is something like a solution that exists at or near room temperature. Gases are also fluids. So is a melt of metal. In zero-g, it becomes possible to use separation techniques as purification processes utilizing either density differences or surface tension differences. Some density difference separation techniques are as simple as spinning-up a glob of fluid in free suspension. (See Figure 6.)

All of this leads to very high-purity materials that are formed free of container contamination. And it becomes quite practical to produce very large quantities of these super-pure materials. High-purity materials such as metals are usually very expensive now, mere laboratory curiosities as aluminum was in the early Nineteenth Century. Super-pure materials often have physical or chemical properties quite different from those of a slightly-impure material. Beryllium is an example of this. When produced on Earth from the double fluoride K2BeF4 by electrolysis, it is brittle and hard enough to scratch glass. When produced by vacuum distillation to get high purity, beryllium is quite ductile. What can be done with inexpensive, easily obtained, ductile beryllium? It's like asking a scientist in 1830 if he thought aircraft would be made of aluminum; both were impractical ideas at that time.

FIGURE 6. Degassing or density separation by the spin technique

In addition to making high-purity materials in space, it will also be possible to make combinations of materials with highly-controlled alloying, doping, or contaminating. Alloy technology began with the first manufacture of bronze, a mixture of tin and copper . . . and it probably occurred by accident. Since then, all alloying has been basically the same: mix two or more molten metals together to achieve a homogenous combination, then let the mix cool.

Sometimes this doesn't work too well with certain metals on Earth, primarily because of density differences between the metals, and often because their surface tensions are so different that they won't mix.

FIGURE 7. It is possible to obtain a uniform, homogenous mixture of materials of different densities in zero-g and maintain the uniformity. Astronauts will be able to keep their oil-and-vinegar salad dressing well-mixed continuously with one good shaking.

For example, gallium arsenide produced on Earth is a semiconductor used in light-emitting diodes, Gunn-effect and avalanche diodes for microwave sources, and limited space-charge accumulation devices. Bismuth is in the same chemical group as arsenic but exhibits immiscibility with gallium. Experts believe a homogenous gallium-bismuth alloy could be produced in the zero-g environment; they do not know at this time precisely what its characteristics and properties might be, but they believe it might exhibit some interesting semiconductor properties.

The classic homogenous alloy process is not limited to metal systems. Any binary or pseudobinary alloy system which exhibits a liquid immiscibility problem should be capable of forming a homogenous alloy under space environment conditions. Some unique materials are going to result from this. It's too early to say for certain, but they will probably be cermet-like materials with several structural and electronic applications.

The space industrial environment is also going to make possible the first significant departure in alloy technology since the start of the Bronze Age. Most alloys have to be formed at high temperatures because of miscibility problems, among other things. However, for years dentists have used an intermetallic compound known as an amalgam, usually a mixture of solid silver and liquid mercury—a metal with a high melting temperature coupled with a metal with low melting temperature. It is also possible to form true homogenous alloys of a combination of a low-melting-point metal and a high-melting-point metal and then "curing" them at higher temperature. The big problem with these "thermo-setting" alloys in the Earth environment is the fact that they are very sensitive to mixture distribution homogeneity. But in the density-insensitive realm of zero-g, it will be possible to achieve and sustain exceedingly homogenous mixtures without concern that they will separate. (See Figure 7.) When we are able to make the first thermo-setting alloys in zero-g, we may be able to use them to form very interesting composite materials when combined with whiskers.

Whiskers themselves are going to become common for composite materials because it will be much easier to make them in zero-g. One problem with whiskers is the inability to grow them to great lengths in a one-g field; they break off. In addition, whiskers are most commonly formed now by growing them around a thin tungsten wire; this complicates the metallurgy of whiskers tremendously, lowers their potential strength, and places a lower limit on how small a whisker can be made. In zero-g, we will be able to form whiskers without the tungsten wire center and will be able to form them with very long lengths.

In the zero-g environment, we will also be able to produce very large, very thin membranes by means of surface tension drawing with no substrate support. On Earth, the production of ultra-thin membranes requires the use of a substrate to support the membrane against gravity forces. The substrate limits the thinness to which a membrane can be drawn because, after you have formed the membrane on the substrate, you have to strip it off the substrate.

Obviously, if you do not have to support the membrane against gravity, you can form a very thin membrane of very large area. It is rather like blowing a gigantic soap bubble or forming a very large soap film on a loop of wire. After you reach the size where the surface tension forces are overcome by the gravity forces, the membrane breaks. In zero-g, you can blow your soap bubble as large as you wish, deposit it on frames as shown in Figure 8, and then break the bubble in between the frames. This is but one method being discussed for formation of large thin membranes in zero-g.

Theoretically, it would be possible to form a membrane the size of a football field (or larger) with a thickness of only a few molecules—or even of only one molecule with certain substances.
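Just to put a number on how little material such a membrane represents, here is an illustrative estimate; the area, thickness, and density figures are assumptions made for the sake of the arithmetic, not values from the text:

$$ m = \rho A t \approx (1000\ \mathrm{kg/m^3})(5000\ \mathrm{m^2})(10^{-9}\ \mathrm{m}) = 5\times10^{-3}\ \mathrm{kg} = 5\ \mathrm{grams}. $$

A football field of membrane a few molecules thick amounts to only a few grams of feedstock, which is part of what makes the process attractive even if the product must eventually be shipped back down a gravity well.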

What are we going to do with these super membranes? What did we do with super-thin films?

Foams and cellular materials are rapidly becoming part of our industrial and commercial product stables. Even foamed metals are available, but they are expensive because they are difficult to produce with repeatable high quality in our one-g gravity field. Again, this is due to density differences between the liquid or melt and the gas used to produce the foam voids or bubbles. Do this in zero-g and you will have uniform distribution of the voids throughout the material, and you will not have to rush the cooling of the material to prevent the bubbles from rising. They can't rise due to density differences. Low density foams exhibit high stiffness, even at high internal pressures; plastic foams have found wide use in packaging where stiffness is required. But their strength is moderate in comparison to the solid, nonfoamed material. We can improve this by making a whisker composite of the main material in zero-g before foaming it. The addition of fibers or whiskers to foamed materials will result in very strong and light materials. We will even be able to beat Mother Nature at her own game in this field, because she originally developed or evolved the fiber-reinforced cellular material called "wood" that can be cut, shaped, sawed, drilled, machined, and fastened just like a plastic. Once we have zero-g space industry making such reinforced cellular materials, we will find all sorts of uses for them, believe me.

FIGURE 8. Production of very large, thin, flat membranes by the bubble process in zero-g.

The low temperatures and high vacuum conditions of the space environment are going to bring to maturity an entirely new area of basic chemistry: free atom chemistry. And it's new. During 1971, Philip Skell and J. J. Havel of Pennsylvania State University announced the results of their preliminary work in this field. Studying the behavior of molecules and what happens between them—the province of chemistry until now—does not tell you very much about the behavior, characteristics, or properties of the atoms which make up the molecules. Common table salt is a glaring example of this. And the behavior of a single hydrogen atom is quite different from that of the diatomic hydrogen molecule in all of its different spin states, metastable states, et cetera. It is like trying to study an individual by studying a crowd of people . . . or studying a man and wife together in an attempt to find out what each of them could do singly.

Free atom chemistry has not been an area of great activity because of the lack of suitable techniques for experimentation. To get free atoms, one usually boils them off a hot wire in a vacuum. Therefore, the temperature of the free atoms is very high indeed. It is not possible to react these free atoms at a temperature of several thousand degrees centigrade with compounds and organic molecules that decompose at a temperature of a few hundred degrees centigrade. Skell and Havel obtained their free atoms by boiling them off hot wires in a vacuum, sending them through a vacuum without collision, and impacting them on surfaces that had been supercooled. Thus, it was possible to study reactions with low-temperature molecules because, even when the superhot free atom hit, it did not raise the temperature of the target molecule above the decomposition point.

There have been some fascinating results thus far from free atom chemistry. For example, Skell and Havel studied some reactions with free platinum atoms. Platinum is a "noble" metal with a high resistance to corrosion and capable of reacting only with the strongest oxidizing agents—such as aqua regia. This is one reason it is in demand for jewelry; the other reason is that it is damned hard to refine it because when it does form compounds, it does so with lasting friendship, so to speak.

But, using the "boil-off plus high vacuum plus supercooled target" technique they developed, they found free platinum atoms to be highly reactive. They even got platinum atoms to react with the innocuous organic molecule, propylene. Thus far, they've worked with over forty metals and discovered other unsuspected reactivities. Interactions and reactions took place on the supercooled target that had never been seen before.

Free atom chemistry holds the promise of entirely new classes of organo-metallic compounds with completely unsuspected properties. Work has only just begun, literally. It's all very experimental because the facilities are somewhat expensive. They require high vacuum and very low temperatures.

But these characteristics are an integral part of the space industrial environment!

Free radical chemistry has also been held back by much the same sort of very expensive facilities and lack of technique. Again, we do not know what we can really expect from free radical chemistry. It was surveyed over a decade ago by the U.S. Air Force Office of Scientific Research, but there were no breakthroughs. We do know that free radical chemical technology would be quite useful in rocket propulsion, for example. For over twenty years, we've known that if a method could be found to produce and stabilize monatomic hydrogen (single-H) in a liquid state, it would make a rocket propellant with very high specific impulse. It has been estimated that single-H would have a specific impulse as high as 1,200 seconds (roughly 11,800 newton-seconds per kilogram), making it competitive with Nerva-type rocket engines. Letting single-H go back to double-H releases a tremendous amount of energy and gives a rocket engineer the very low molecular weight he desires for an exhaust gas. Single-H would also make an excellent energy source for earthbound uses. The space environment appears to offer the characteristics needed to study and develop free radical chemistry. It may be that in space we will develop the single-H capability to create super-cheap space transportation!
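The arithmetic behind a figure like that is worth a glance. Recombination of atomic hydrogen releases the H-H bond energy, about 436 kilojoules per mole of H2 formed, and an ideal-limit exhaust velocity follows from turning all of that into kinetic energy (a deliberately crude upper bound; real engines would fall well short of it because of temperature limits and incomplete recombination):

$$ E \approx \frac{436\ \mathrm{kJ}}{2\ \mathrm{g\ of\ H}} \approx 2.2\times10^{8}\ \mathrm{J/kg}, \qquad v_e = \sqrt{2E} \approx 2.1\times10^{4}\ \mathrm{m/s} \approx 2{,}100\ \mathrm{seconds\ of\ specific\ impulse}. $$

Even after allowing for large losses, that leaves room for performance in the 1,200-second class quoted above, well beyond anything a fuel-and-oxidizer chemical engine can reach.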

Producing these new materials with new chemistries in the space environment will certainly have at least as great an impact upon our culture as the development and use of plastics, semiconductors, and petrochemicals. It will require space factories to make them . . . and to make many of the other things, old and new, that we are now making on Earth with a growing problem of disposing of the wastes thus generated.

Some people will say, "Who needs them?" Others will wonder, "Why bother going into space to establish more industry? Progress isn't measured by the quality or quantity of industrial products any longer!"

Others will say nothing because they know very well that without those industrial products we are back where we were before the First Industrial Revolution, when one farmer, working as hard as he could, was able to feed only one family other than his own . . . and that was at the whim of fire, flood, insect pests, and drought. True, in an agricultural peasant economy, everybody has a little of something, but nobody has very much of anything. And fifty percent of the children born die before reaching the age of five.

And still others will say nothing, knowing that the telephone wasn't wanted before it was invented. They won't talk about it; they'll go ahead and do it. Where there is money to be made and a competitive advantage to be gained in the marketplace, there will always be people ready with the risk capital, the technical know-how, and the ability to produce.

Time has a strange tendency to alter the perspective of people. A generation ago, there was a goal of getting into orbit—period—regardless of cost; today, getting into orbit is a commonplace thing that doesn't even make the newspapers. (How many people even noticed that Great Britain finally made it into the Space Club by orbiting their own satellite, Prospero, with their own indigenous launch vehicle, the Black Arrow, in October 1971?) Today, the major obstacle is the high cost of getting into the space environment; who is willing to bet that ten years from now cost will no longer be a problem? We will probably be worrying about having enough orbital capability!

We have lived thus far in a distorted part of the universe: the Earth's surface. We are moving out of this distorted, cramped, and vulnerable chunk of space into the universe where we will be able to build, fabricate, manufacture, produce, and create what we need better, cheaper, and in more quantity without further disturbing our home planet.

The Third Industrial Revolution will complete the work started by the First. It will free us forever from dependence upon our home planet and from its whims, quirks, famines, shortages, and lack of room. It will free forever human muscle power and human brain power so that we can pursue whatever our real destiny is.