Wednesday, December 15, 2021

Secret story of EKIP, the Soviet flying saucer 

Parts 1-4

Image : EKIP Aviation Concern / Авиатехник

VALIUS VENCKUNAS

At one point in history there was an aircraft that could take off and land without an airport, despite being the size of a Boeing 747 jumbo jet. It used an eighth of the fuel and cost half as much as any conventional airplane. It was also completely immune to crashes, and looked like something out of an 80s sci-fi movie.

It was the EKIP: the Russian flying saucer. The transport of the future that never was.

What a future that could have been, though. If one were to believe everything written about the project – a difficult task, as half of the information contradicts the other half – the EKIP is one of those technologies that could have revolutionized everything on the planet. Think nuclear fusion reactor, plus EmDrive, multiplied by Burnelli lifting fuselage. Yes, it goes that deep.

The actual facts are difficult to discern, lost in the mist of time, propaganda, advertisements, media sensationalism, and untamed excitement of some of the world’s craziest conspiracy theorists. But at least part of the truth can be siphoned from it, revealing a gripping story of (alleged) inventiveness, (alleged) espionage and (alleged) corruption.

The beginning

The story starts, and for the most part ends, with engineer Lev Nikolayevich Schukin. If his contemporaries and associates are to be believed, the man – until his death in 2001 – believed in the uniqueness and immense potential of the inventions he proposed.

Schukin started his career at the Energia design bureau, where he had a chance to work on the ill-fated N1, the super-heavy rocket designed to rival NASA’s Saturn V. According to some accounts, he left after the collapse of the Soviet Moon program and the subsequent turbulent change of Energia’s leadership. According to others, he stayed a little longer, and played an important role in the Apollo-Soyuz mission.

Then there was a period of working on hovercraft, an experience which could have heavily influenced the upcoming EKIP. By the late 70s, Schukin started having his own ideas about a completely new type of transport, and offered them to the leadership. In 1979 he got permission to form a separate bureau for the development of his unique project.

The project was named “EKIP”, a very Soviet, but at the same time quite uncharacteristic, acronym of “ecology and progress”. Ecology, as in the new aircraft would be more environmentally friendly than anything before it. Progress, as in it was the future.

The list

The whole point of Schukin’s work was to combine a whole slew of very innovative, very unproven ideas in one airframe, leapfrogging any competing developments of prospective vehicles.

The airplane – if such a name can be used at all – would be a variation of a flying wing, something that was always futuristic in itself, no matter the context. In theory, aircraft of this type can be more efficient than classic tube-and-wing designs, as they don’t have parts that generate drag without generating lift.

Flying wings are notoriously difficult to fly, though, so a new, heavily computerized control system had to be devised. There is very little information on this development, but it alone was a significant part of EKIP’s innovativeness.

Conventional airplanes need a lot of infrastructure to function; hence, large and expensive airports have to be built and maintained. An aircraft of the future had to do away with all that by landing on an air cushion, like a hovercraft. That would also mean an ability to land on water, on unprepared ground and, generally, on any unobstructed surface of suitable length.

Being a flying wing with relatively high drag, the aircraft would sacrifice some speed. But its landing speed would be significantly lower too. As a result, landings would be safer, and there would be no need for long runways. The combination of a slow landing speed and an air cushion would enable the aircraft to perform an emergency landing pretty much anywhere, making it, for all intents and purposes, crash-proof.

But air cushion mechanisms are heavy and complicated, and especially awkward to fit on a flying wing-type aircraft. The shape has to be suitable for the installation of a cushion, and if the internal cargo hold or passenger cabin is to be made spacious and comfortable (a prerequisite for the aircraft of the future), the airframe has to be rather bulky.

Bulkiness means a lot of drag. A contraption to solve that issue has to be devised, in the form of a boundary-layer control mechanism that governs how the air behaves on the aircraft’s back. In theory, that could prevent the formation of vortices that slow the vehicle down. In practice, developing a working boundary-layer control system requires calculations that would give 1980s computers nightmares.

Oh, and the fuel in our aircraft of the future can’t be regular either – the “EK” part of the name isn’t there just for show. A new jet engine has to be devised that could consume not only jet fuel, but hydrogen and natural gas too. It would even accept aquazine, a mix of water and waste products of the gas industry – a “wonder fuel” developed by Soviet chemists at the time, and a subject of many a conspiracy theory since then.

The fact that such an engine would be a very efficient high-bypass turbofan with noise reduction features looks rather tame in this context.

The gaps

From this list alone one can imagine the hopes put into the project. Schukin quickly gathered a team of able and loyal engineers, but the work was hard and slow. The sheer amount of cutting-edge developments made it an all-encompassing effort much of the Soviet aircraft industry could have worked on – and according to some accounts, it did. Supposedly, scientists from the Central Aerohydrodynamic Institute (TsAGI), Energia, and half a dozen other Soviet design bureaus were attached to the project.

Yet the situation is very unclear. In almost all accounts, there is a gap between 1982, when the first prototype was constructed, and the late 80s, when the work on the second prototype started. There is no documentation, no names, no data. All the footage of wind tunnel tests at TsAGI was filmed in the early 90s, so, there is a chance that the project spent quite a lot of time on hold.

Why? Various reasons might have contributed to that. It is possible that Schukin’s development was not taken very seriously by the leadership, for being either too unconventional, too expensive, or not realistic enough. Its breakthrough came after the collapse of the Soviet Union, when the climate for radical ideas was perfect.

Whatever really happened, the capitalist makeover sunk almost all scientific developments from the Soviet era, but offered immense opportunities for those scientists who could adapt.

Schukin created the EKIP Aviation Concern, and started advertising to the West.



Image : EKIP Aviation Concern


How Russians tried to sell a flying saucer | EKIP Part 2

In the late 70s, Soviet engineer Lev Nikolayevich Schukin came up with an idea for an aircraft of a new kind: a flying wing that would take off from an air cushion, use a new kind of fuel, and be incredibly efficient thanks to an innovative boundary-layer control system.

A breakthrough idea, a case of suppressed technology, or just a baseless legend? This is the beginning of the story of the EKIP - Russia's flying saucer.

By the late 80s, Schukin’s team was, allegedly, composed of the cream of the crop of engineers from the Sukhoi, Ilyushin, Tupolev, and Energia bureaus, as well as the TsAGI institute and several other elite research institutions. They were based in Nizhny Novgorod, at the testing facility of the Sokol aviation plant – the birthplace of many top-of-the-line Soviet military jets, from the MiG-15 to the MiG-31.

A small-scale technology demonstrator, called the L-1, was built there. Its wobbly runs on the facility's airfield featured early experiments on the boundary layer control system – the key feature which, according to later promotional material, passed the test with flying colors. Supposedly, all the calculations on the efficiency of the aircraft were proven, and there was no doubt that the full-scale vehicle would surpass conventional airplanes in every aspect. In 1989, in accordance with changing times, the EKIP bureau was reorganized into a firm.

The model never took off, though. In the winter of 1990, during a routine test run, the L-1 veered off the frozen runway and spectacularly crashed into a pile of snow – an episode, for some bizarre reason, featured in later promotional material. Nobody was injured, but the project was supposedly deemed dangerous.

The plant’s management did not hesitate and kicked the team out of the premises – a strange move that underscored both the lack of support Schukin had and the growing decentralization of the Soviet system. A personal initiative could now make or break a project on a much larger scale than ever before.


EKIP L-1 smashing into a pile of snow. (Image: EKIP Aviation Concern)


New winds

The firm found a new home at the Saratov aviation plant, at the personal invitation of the plant’s head. A less glamorous, but at the same time less militarized and decidedly less strictly controlled facility, the place became the scene of some of the wildest legends that surround the project to this day.

The move to Saratov signified the start of the golden age of the EKIP, even as the Soviet Union crumbled all around the vehicle. In one of the plant's 90 hangars, amongst towering airframes of unfinished Yakovlev Yak-42 airliners, the team began working on the second prototype – the L-2. It was still remotely-controlled and had conventional landing gear, but was supposed to be flyable and featured a provision for an air cushion.

Anatoly Savitsky, an up-and-coming businessman with an academic background, became Schukin’s right-hand man. He understood the commercial potential EKIP had in a climate veering more and more toward capitalism.

Savitsky managed to attract investors of a scale yet unseen: Alexander Mikhailovich Mass, one of the first oligarchs of the emerging Russian gas industry, offered $1.5 million and a cheesy motto for the company – “On the wings of the dream we will fly to the bright future of the humanity.”

Schukin’s connections brought in Nikolay Kuznetsov, the founder of the famous Kuznetsov bureau responsible for many Soviet jet engines. He promised to personally oversee the creation of EKIP’s wonder turbofan, which was supposed to consume a wide range of different fuels and have a thermal efficiency of over 50% – at a time when most commercial jet engines barely reached 30%.

By 1993, Russia’s governmental Committee on the Issues of the North and the Ministry of Agriculture were on board, promising solid sums of money. The project became a personal favorite of Oleg Lobov, the secretary of the presidential Security Council of Russia, and – reportedly – received considerable backing from the military as a whole. Even President Boris Yeltsin was supposedly following the development with great interest.

According to Savitsky, up to 600 people worked on the project at that time. The L-2 took off in 1992 and was shown at the Moscow air show the same year. In 1993, it was exhibited at the Paris air show and the Russian government promised to invest a further 1.2 billion rubles.

New troubles


Mass’s investment was the only money the team actually got its hands on. It opened a door which had hitherto been closed: the one to wind tunnel testing.

The problem was, TsAGI – the leading aeronautical institute with the best equipment on this side of the now-fallen Iron Curtain – did not want to have anything to do with the project. According to Savitsky, several of TsAGI’s scientists worked on the EKIP, while the rest viewed the development with contempt.

The reason for that is unclear. In several contemporary interviews, Savitsky presents it as a sign of the backwardness and irrelevance of people belonging to the freshly collapsed Soviet system. The other possibility is a bit more likely – that academia at large simply viewed EKIP’s designers as charlatans, due to their fantastical claims and lack of actual evidence.

Nevertheless, at the time the institute did not have the luxury of choosing its sources of income. After the original L-2 crashed during a flight test, a second model, the L-2B, was constructed and tested in TsAGI’s wind and water tunnels. The model was equipped with floats instead of wheels or an air cushion – a feature present in most of its subsequent iterations, and the one that gave the vehicle a large part of its distinct, futuristic look.

None of the several manufactured EKIPs ever had the planned air cushion attached and working, despite many of them carrying auxiliary engines installed for that purpose. Custom-built air cushion equipment was expensive, and every ruble invested in the project either had to be spent elsewhere or never actually reached the engineers.

That was because of the rather tragic circumstances the firm found itself in. The Soviet Union and its economy fell apart, giving way to the Russian Federation and a new capitalist system. But that system quickly developed a habit of falling apart itself. The economic “shock therapy” brought hyperinflation, halved Russia’s GDP, and opened the door to ever-present corruption. Whatever money the government or investors allocated to the cause simply disappeared on its way to the recipients, depreciating in value and trickling into the pockets of intermediaries.

By the mid-90s, the situation had become desperate. Without pay, the team was melting away. Schukin was constantly sending letters to the government, the military, and Russia’s aeronautical firms, asking them to save the project. But according to later interviews, time and time again he witnessed the same dark comedy: an assurance of full support, a promise to consider sending the inventor some money, and not a single ruble actually sent. The largest fiasco of this kind happened in 1999, when funding for the project was actually included in the governmental budget, but nothing came of it.

Schukin ended up sinking much of his personal property into the project, hoping to keep the shrunken team afloat a little while longer. The only hope now could come from beyond Russia’s borders.

New hopes


The firm tried attracting Western investment even before the collapse of the Soviet Union. Its very first promotional reel, likely made in 1991 and complete with an eerie synth soundtrack, retro-futuristic computer graphics, and delightfully broken English, invited “the Soviet and foreign firms for cooperation”.

It was also the year when foreign investors started frequenting the Saratov plant, inspecting models and promising to return with what the designers hoped would be suitcases full of Western currency. Supposedly, between then and the early 2000s, 255 investors considered working with the firm.

In later years, members of the EKIP team made a habit of providing a lot of lists in their interviews. One of those lists was of the fantastical qualities of the vehicle. Another was of countries that were supposedly interested in the development: from the UAE to Germany and from China to Argentina. Such geography was supposed to contrast the “world-wide fame” of the project with the utter disinterest it was receiving in Russia – and maybe, just maybe, stir some national pride in a rich oligarch or two.

Because of this, it is unclear how many foreign investors were actually interested. Although some of those 255 may have visited the Saratov plant with an appetite for opportunities, most were likely little more than tourists.

There was one visitor with serious intentions among them, though. In 2002, US Congressman Curt Weldon came to Saratov. He was a co-founder of the Duma-Congress Study Group, an international committee intended to bolster cooperation between the former rivals of the Cold War.

EKIP’s team was deeply impressed by the visit, and the visitor, reportedly, was impressed by the EKIP. A year later, the Saratov aviation plant signed a memorandum of understanding with the Naval Air Systems Command (NAVAIR) – the US Navy’s materiel support command for naval aircraft and airborne weapons. It is quite symptomatic that in many interviews with the Russian media, EKIP’s engineers presented NAVAIR as an American company looking to invest in Saratov.

On the American side, the cooperation was headed by Dr. John Fischer, NAVAIR’s director of research and engineering sciences. He oversaw the signing of the formal contract in 2004; according to it, the Saratov plant would produce a 230 kg (500 lb) prototype which would then be tested at Naval Air Station Patuxent River. The testing was scheduled for 2007.

There are differing accounts of what happened next, though. The most likely version is that in 2005 NAVAIR informed the EKIP concern that it was no longer interested, for unknown reasons. AeroTime News tried contacting the agency about this but did not receive an answer.

The Russian version of events differs.

New legends


Members of the EKIP team told the media, on numerous occasions, that the Americans had actually built the prototype themselves. They had spent millions of dollars, Boeing was somehow involved, yet all they managed to construct was a barely flyable knock-off of the Russian original. It crashed on its maiden flight, and to save face, NAVAIR decided to sweep the development under the rug.

There is also a version where Schukin himself either visited the US to work on their prototype before his death in 2001 or was invited and declined the offer. Both Boeing and Airbus supposedly tried to recruit Schukin too, yet the inventor refused to sell out to Westerners, wanting EKIP to kick-start Russia’s economic recovery instead. In some of those accounts, foreign companies were about to start manufacturing copies of EKIP in the very near future; in others, they were unable to reproduce the Russian engineering miracle.

These legends might seem absurd, and they have become an integral part of many Russian discussions of lost Soviet glory. Yet they are a direct result of the marketing strategy the company had adopted.

The legendary status of the vehicle – the extremely low cost, the incredibly high efficiency, the yet-unheard-of flight characteristics – originates from interviews, brochures, and videos the EKIP Aviation Concern produced in the 90s and early 2000s. Most of them were aimed directly at a Western audience.

Sometime in the late 90s, the Discovery Channel filmed an episode on the project; it was featured on ABC’s “Beyond 2000” too. Technological miracles happening among the rusting hulks of abandoned Soviet airplanes made for an attractive story. The more fantastical the figures the engineers provided to journalists, the more attention they got. Such a tactic was supposed to lead to investments, but it did not really work, and in some cases – such as with TsAGI’s scepticism – it even backfired.

To this day it is impossible to discern what part of the EKIP’s legendary status was created purely for promotion and what part was actually substantiated. The initial project Schukin worked on in the 80s may not even have had a sci-fi angle; it may have been a simple idea for a vehicle with a boundary layer control system, which later became overgrown with a dozen cutting-edge features and resulted in this vehicle-of-the-future, perfectly adapted to the sensation-hungry climate of 90s Russian capitalism.

After the failure of the cooperation with NAVAIR, several Russian journalists visited the Saratov plant and the hangar where the EKIP L2-3, the largest of the constructed prototypes, stood. They described the sorry state of the project: a team of a couple of dozen elderly scientists, led by “the chief of the saucers” Valery Sorokin; a cold place full of incredible inventions falling apart from neglect; a bright Soviet future buried by corruption, indifference, and the greed of both domestic and foreign officials and businessmen.

The worse the state of the project got, the more food there was for legends, and separating them from the real situation became even more difficult. But despite the mythical status, there are some hard facts – and some glimpses of the future the EKIP might have brought upon us.


Image : EKIP Aviation Concern

What is left of the Russian flying saucer? | EKIP Part 3



EKIP was a project to develop an aircraft with some wondrous properties: a saucer-like flying wing that would be more efficient than any contemporary airplane. After the collapse of the Soviet Union, a company created by chief engineer Lev Nikolayevich Schukin struggled to procure funds to develop the idea. After Schukin’s death and several unsuccessful attempts to attract Western investors, the company disappeared.

Reportedly, in the late 90s and early 2000s, the fields and swamps around Saratov became infested with flying saucers. People who went fishing or hiking there got so used to seeing unexplainable phenomena in the sky that nobody even considered such events unusual.

At the same time, guests of the Saratov aviation plant were gladly shown the cause of those legends: incredible inventions of the plant’s engineers. Suitcase-sized saucer-like vehicles that could take off vertically, hover above the ground, accelerate and change direction almost instantaneously – and do all of that almost without a sound.

Supposedly, those were models of the EKIP. Nobody saw them performing such feats firsthand, but every journalist reported that somebody – other journalists, plant’s workers, potential investors – could not hide their excitement at what those vehicles could do.

In reality, contrary to countless sensationalized descriptions in the contemporary press, EKIP aircraft were neither capable of vertical takeoff nor able to hover; they might have been test-flown several times in the mid-to-late 90s, but it is doubtful that tests happened beyond the premises of the factory. So, Saratov’s UFO legends are far more likely to have been inspired by actual extraterrestrial activity than by the EKIP.

Besides local legends, the EKIP team was responsible for half a dozen models of various sizes, as well as several lines of concepts for prospective vehicles based on the idea. Cataloging all of them is a difficult but fascinating task.

Material legacy

Of the physical artefacts left by the project, the most impressive are the scale models and prototypes that actually got constructed. The majority of them were intended for display or ground tests; at least a couple – the L-1 and the L-2 – flew. There was also a prototype of the EKIP-AULA L2-3 drone, one of the last products the EKIP Aviation Concern tried to sell. The drone may or may not have performed some tests, but its main use was to be photoshopped onto impressive vistas, supposedly flying over the green forests and clean lakes of Russia.

It was not the one to which the name of the whole project usually got applied, though. In much of the material on the development, a large metallic aircraft with a wingspan of 14.4 meters and a Russian coat of arms on its nose is presented as the crowning achievement of engineer Schukin. Most people referred to it simply as “the EKIP”, although it had an actual name – the L2-3.

It was intended as a prototype of an actual production variant, representing the smallest passenger aircraft the concern planned to manufacture. The aircraft was to be remotely controlled, though – maybe due to safety concerns, and maybe because the team thought it would be cheaper.


The EKIP L2-3 prototype (Image: EKIP Aviation Concern)

Work on it started in 1993. In several interviews, team members claimed that two L2-3s were actually under construction: one for ground tests and one capable of flying. Later, the two got combined into one, with parts of the second prototype supposedly manufactured but never seen in any of the promotional photos.

Those parts were produced by Schukin’s old colleagues from the Energia bureau and shipped to Saratov for assembly. The near-finished L2-3 became a backdrop for many videos, photo-ops, and articles on the EKIP written in subsequent decades. It was perpetually “several months” from being finished, although there are some claims that by 2003 the work had actually been completed. That was the same year the US Navy got interested in the project, and a flight demonstration of the large prototype would have been the perfect way for the team to secure funding. Nevertheless, the L2-3 never took off.

It had some sort of computerized control system installed, as well as a supposedly working air cushion of a not-yet-tested design. That cushion, as well as the boundary layer control system, was to be powered by a Pratt & Whitney Canada PW206 helicopter engine, while the main jets the aircraft would use for flight were a pair of PW305A turbofans – the same ones that power the Bombardier Learjet 60 business jet. If the technical specification for the L2-3 is correct, the aircraft, equipped in such a way, would be quite overpowered.

It is difficult to tell whether the engines installed on it were working or not. After the company went bankrupt and the team dispersed, the prototype was not disassembled for parts, as happened to many others in Russia. That may indicate that it was well protected – or that the parts mounted on it simply were not that valuable.

The EKIP L2-3 stood in the empty hangar of the Saratov aviation plant until the plant’s closure in 2011. Then it was moved to a museum in the village of Ivanovskoe, near Moscow. Most likely, it stands there to this day, a reminder of the shape that – many hoped – would eventually be seen in the skies above the country.

But actual production models of the EKIP would not have looked like that.

Downsizing the dream

Concepts of EKIP commercial vehicles went through several iterations, fluctuating in size as the project progressed. Names of the models were constantly recycled ‒ a decision that makes understanding the scope of the company's work rather difficult.

The first plan was conceived by 1991 and laid out in the first promotional video. Saying that the ambition was grand would be an understatement.

Three models were envisioned: the L-2, the L-3, and the L-4, with takeoff weights of 7, 110, and 600 tons, payload capacities of 2, 50, and 250 tons, and passenger capacities of 18, 500, and 2600 respectively. Yes, the largest EKIP was planned to be capable of carrying 2600 passengers.


A snapshot of a promotional video, likely made in 1991. Three models with their cargo and passenger capacities are listed. (Image: EKIP Aviation Concern)

For comparison, the Antonov An-225 Mriya has a maximum takeoff weight of 640 tons and holds the world record for carrying almost 190 tons of payload. The Airbus A380, the largest passenger aircraft in the world, usually carries no more than 575 people. It is difficult to say whether Schukin’s team took the idea of a 2600-seat airliner seriously, but before giving this development some thought, let’s briefly discuss what happened later.

By 1993, the team had started talking of another selection of models: the EKIP now had to come in five flavors, from the 9-ton L2-3 to the 600-ton L4-2. The latter had the same weight as 1991’s L-4, but now had a 128-meter wingspan (slightly larger than that of the current world record holder, the Scaled Composites Stratolaunch) and a passenger capacity of 2000. Its cargo capacity dropped to a “modest” 200 tons.

As far as it is possible to tell, this set of proposals was kept up until the early 2000s, with some insignificant updates. In either 2000 or 2001, the company presented its competitor to the Airbus A3XX (before the latter got its proper name – A380). The new model was named L3-2DM and was seemingly a middle ground between the earlier L3-2 and L4-1 models, with a passenger capacity of 656.

A couple of years later, most likely in 2002, information on a third set of models was published on one of the early iterations of the company’s website. The L-4 and its variants were dropped; the L3-2 became the heaviest model, with a weight of 360 tons and a passenger capacity of “only” 1200.

This final outline of the possible EKIP family remained unchanged for the rest of the project’s life. In subsequent years, it was updated with a couple of new proposals intended to reflect the changing times.

One of them was the EKIP-AULA L2-3, a 350-kilogram model finished by the already wavering company in 2003 and possibly tested on water. The team later described it as a prototype of a reconnaissance and patrol drone, probably intended to be equipped with cameras. A proposal for a heavier drone, simply named EKIP-2, was published a year later, along with a passenger variant, the EKIP-2P, capable of carrying two people.

So, in the span of several years, the team went from proposing airliners the size of a city to surveillance drones barely able to lift several kilograms of equipment. Meanwhile, almost two decades after the project’s death, it is not uncommon to see mentions that the EKIP was going to be a competitor to the Boeing 747, the Airbus A350, or whatever large aircraft is popular at the time. Sure, those proposals existed, but they were just a fraction of EKIP’s prospective lineup. And there are serious reasons to think they were not well thought out.

The giants


If built, EKIP L3s and L4s would have been some of the largest aircraft ever constructed. Reportedly, some of them were supposed to have three decks, but most likely the team never went as far as modeling the aircraft beyond the basic shape.

Several scale models of the 1993 version of the EKIP L3-1, with 80 red seats in a semi-transparent plastic fuselage, offer the only glimpse into how this type of vehicle could have looked. If Schukin or his colleagues ever made blueprints of the other models, they are, most likely, lost.


Scale model of 1993 variant of EKIP L3-1 (Image: EKIP Aviation Concern)

It is not difficult to see some logic in their designs, though, with clear divisions between regional, mid-sized, and intercontinental jets. It seems dreaming big was the team’s specialty: every time they came up with a new set of models, they could not help but include some ridiculously sized ones.

The ones with a capacity of over 1000 passengers look nothing but absurd nowadays, but we have to remember that those were the early 90s. The Airbus A380 had just entered development, and superjumbos seemed like the next big step in civilian aviation. In ex-Soviet countries, every major engineering bureau had plans to develop one, from the innovative Sukhoi KR-860 to the 1200-seat Tupolev Tu-404. The EKIP L-4 and L4-2 would have overshadowed all of them, though.

How realistic were these proposals? If we discard the most ridiculous claims made by journalists and other enthusiasts at the time of the aircraft’s development and look at the numbers presented by the EKIP team at the very end of its existence, some of the data starts looking quite possible. The L3-1, as described in 2002, would supposedly be able to fly 4,000 kilometers on 14 tons of fuel while carrying 160 passengers. The similarly sized Embraer E195-E2 regional jet carries up to 146 passengers, has a range of 4,900 kilometers, and holds 13.7 tons of fuel. The payload capacity and weight of the EKIP L3-1 are also roughly similar to those of the E195-E2.
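
As a rough, back-of-the-envelope check of those figures – the calculation below is ours, uses only the numbers quoted above, and ignores reserves and payload differences – here is the range each design gets out of a ton of fuel:

```python
# Back-of-the-envelope check: kilometers of range per ton of carried fuel,
# using only the figures quoted above (no reserves, no payload correction).
specs = {
    "EKIP L3-1 (2002 claim)": (4000, 14.0),   # (range in km, fuel in tons)
    "Embraer E195-E2":        (4900, 13.7),
}

for name, (range_km, fuel_tons) in specs.items():
    print(f"{name}: {range_km / fuel_tons:.0f} km per ton of fuel")
```

That works out to roughly 286 km per ton for the claimed L3-1 against 358 km per ton for the E195-E2 – about 20% behind a modern regional jet, which is arguably close enough for a 1990s paper design to count as “quite possible”.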


Just for the fun of it, here is a comparison of the largest EKIP proposal (the L4-2 from 1993) with some real aircraft. (Image: AeroTime News)

On the other hand, with the capacities of their heaviest models, the EKIP team strayed straight into fantasy. In most cases they seem to have taken the theoretical payload capacity of an aircraft and divided it by roughly 100 kilograms to get the passenger capacity. With smaller planes such a calculation is somewhat close to reality. With larger ones it does not work.

The Boeing 747-400 has a passenger capacity of 416, while the 747-400F freighter’s maximum payload is less than 125 tons. The Airbus A380 can carry between 575 and 853 passengers depending on the configuration, and its freighter version, though never built, was intended to carry no more than 150 tons.

If the formula EKIP’s engineers applied to their aircraft worked, the 747-400 and the A380 would be able to carry 1250 and 1500 passengers respectively. That is not the case, mainly because passenger cabins require seating, heating, toilets, entertainment systems, and other amenities without which people would not agree to board the aircraft.

The weight adds up, especially on long-haul aircraft. The fact that Schukin’s team did not consider those requirements shows that they did not reach the phase of actually designing most of their proposed aircraft, and presented numbers based purely on calculations. And even those numbers put the wondrous properties of their aircraft merely on par with early-21st-century jets.
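
To make that back-calculation concrete, here is a minimal sketch in Python; the 100-kilogram-per-passenger divisor is our reading of the team’s published figures, as described above:

```python
# The apparent EKIP rule of thumb: passengers = payload / ~100 kg.
def ekip_style_capacity(payload_tons: float, kg_per_passenger: float = 100.0) -> int:
    """Convert a payload figure into an EKIP-style passenger count."""
    return int(payload_tons * 1000 / kg_per_passenger)

# Applied to two real freighters, the rule predicts absurd numbers:
print(ekip_style_capacity(125))  # 747-400F payload -> 1250 "passengers" (the real 747-400 seats 416)
print(ekip_style_capacity(150))  # planned A380F payload -> 1500 "passengers" (the real A380 seats 575-853)
```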

But to have those properties – with the bulky fuselage, heavy air cushion system, and other features not present on sleek modern jets – the main requirement had to be met: the incredible increase in efficiency, provided by the boundary layer control system.

Its properties are yet another mystery to unravel. In fact, Schukin and his colleagues were neither the first nor the last to research this concept – some aviation enthusiasts attempted to implement it several decades earlier, and in 2005, after the EKIP Aviation Concern had practically ceased to exist, European scientists began quite a monumental project to study the phenomenon.

Would the Russian flying saucer actually work? | EKIP Part 4



Image : EKIP patent picture


Through the 80s and 90s, Lev Nikolayevich Schukin and his team of engineers tried to develop and sell an aircraft with, according to its developers, some quite remarkable properties. They did not succeed, although in subsequent years the claims about their creation proliferated.

With all due respect to the engineers who poured their hearts into making their project attractive, we have to look at it critically. Examining the claims made by the team behind the EKIP and finding out whether they are true is a difficult but, in fact, possible task.

On the surface, those claims were, as explained in the first part of this series, fairly simple. The aircraft will be roughly in the shape of a flying wing. It will have an air cushion instead of a landing gear. It will have a sophisticated computerized control system to overcome the inherent instability of the airframe. It will have a turbofan engine that is incredibly efficient and can run on an array of different fuels. It will have a boundary layer control system that will prevent airflow separation and negate much of the drag.

As a result of low landing speed and an air cushion, the aircraft will be essentially crash-proof. As a result of its shape, boundary layer control system, and new engines, it will be much more efficient and much more ecology-friendly than regular aircraft. Its internal space will also be much larger than that of regular airplanes.

The efficiency claims vary widely: from being at least on par with early 21st century jets, to using 20 to 50 percent less fuel than any other aircraft, to forgoing conventional fuels altogether – running on water, aquazine, natural gas or some other substance, or at least using negligible amounts of jet fuel (as little as 20% of what regular airplanes consume).

Some of those claims are closely intertwined, others not so. Some of them are hugely exaggerated, others not so. Let’s unpack them one by one.

Engines, computers, and a cushion

The fantastic properties of the EKIP engines, as explained in previous parts, should be discarded as an exaggeration made at the very late stages of the project, mostly out of the desperation of its participants. After all, most factual descriptions of the EKIP list all of the models as using regular engines and attaining impressive, but not impossibly low, fuel consumption with them.

An exception would be Kuznetsov’s multi-fuel engines promised in the early 90s, but they never came, and it is likely that, for most of the project’s life, nobody expected them to. All the models promoted in 1991 were described with regular, mass-produced engines, such as the Progress D-436. By 2001, the multi-fuel capability had been completely dropped from published claims about the aircraft, although the EKIP was still mostly described as running on natural gas. The L2-3 large-scale prototype was equipped with regular Pratt & Whitney Canada PW305A turbofans, and the models list from 2001 describes all EKIPs as equipped with conventional jet engines made by Pratt & Whitney or Ivchenko-Progress.

So, the exotic propulsion features were just an addition, not intertwined with the aircraft’s other properties. The exceptional safety, on the other hand, was intertwined: it was a function of low landing speed and an air cushion. The first component falls victim to the same argument as most super-safe aircraft ideas, like the Burnelli lifting fuselage: any kind of aircraft can be made to have a low landing speed, purely by increasing its wing area. The tradeoff is lower cruise speed and higher fuel consumption.

If such a sacrifice is accepted, the addition of an air cushion is a completely plausible idea. Starting from the 70s, there were a lot of experiments – both in the US and in the Soviet Union – with mounting air cushions on transport aircraft. While the experiments were successful, the idea was, for the most part, deemed uneconomical. If the efficiency of such an aircraft could be increased – for example, by a dual-purpose auxiliary engine, a shape more suitable for a cushion, or a more aerodynamic cushion design – the idea could see a comeback. The EKIP air cushion was supposed to be partially foldable, and while it probably was never tested, nothing about it strikes one as particularly unreasonable.

As for the computerized control system, such a thing is a given in all modern aircraft. The first flights of EKIP scale models were wobbly due to the system’s imperfections. But by the mid-90s, fly-by-wire had become the norm in civil aviation, enabling a whole avalanche of flying wing airliner ideas. If built, the EKIP would likely have made full use of that.
 
The fuselage and the efficiency


The other claims about the EKIP boil down to one claim with a selection of positive consequences.

The aircraft is, for the most part, a thick flying wing. In this regard, it is similar to many other flying wing or blended-wing-body (BWB) designs proposed all over the world, including the Soviet Union, since the dawn of aviation.

But being rather thick, such a fuselage would in itself create a lot of drag in comparison with regular, streamlined aircraft. This happens mostly due to one particular phenomenon.

As such a vehicle moves forward, the airflow around it remains mostly attached to the surface. But after passing the thickest point of the body, it starts peeling off, unable to stick to the surface for much longer. Downstream of the point of separation, the pressure becomes very low, dragging the aircraft back. Essentially, the aircraft starts creating a pocket of low pressure behind it.
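
In textbook boundary-layer terms – a standard formulation, not anything from EKIP’s unpublished materials – the flow “peels off” at the point where the wall shear stress drops to zero while the pressure rises downstream:

```latex
% Classical flow-separation criterion: the boundary layer detaches where
% the wall shear stress vanishes under an adverse pressure gradient.
\tau_w = \mu \left. \frac{\partial u}{\partial y} \right|_{y=0} = 0,
\qquad \frac{\mathrm{d}p}{\mathrm{d}x} > 0
```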

To prevent that from happening, a device to keep the airflow attached all around the fuselage has to be devised. The boundary layer control system proposed by Schukin is exactly that: it swallows the boundary layer as it is about to separate, redirecting it into the engine air intake.

The suction is done through cavities containing small, controlled vortices. With such vortices trapped in the right places, the part of the drag that comes from boundary layer separation can be greatly reduced.

As a result, the EKIP could be more efficient than a flying wing without such a system. This efficiency would enable the use of other features: the air cushion, the spacious fuselage, the internally-placed engines, and so on. It would substantiate all the other claims such as safety and low fuel consumption.

The whole idea of the EKIP rests on that boundary layer control system. The question of whether it would really make the aircraft that much more efficient becomes pivotal.

While this question can be answered, the answer is not as straightforward as “yes” or “no”. It has much more to do with the aviation industry and its economics than many engineers – Schukin included – would like it to.
 
Kasper wing


But before that, let’s make one thing clear: Schukin was not the first to think of using trapped vortices. The phenomenon was first described by aeronautical engineer Witold Kasper in the early 1970s. Kasper – a former Boeing employee and an avid gliding enthusiast – noticed that with his glider flying at particular angles of attack, and with its control surfaces locked in a particular configuration, the aircraft seemed to glide much better.

That was, supposedly, because vortices were being created and trapped along the wings, reducing drag. Kasper went on to construct and patent an aircraft meant to make the best use of this phenomenon – the Kasper Wing.


Much to the inventor’s disappointment, it did not turn out to be a great success. Its story is murky and full of inconsistencies, and later researchers working from Kasper’s patent were unable to replicate the effect. The engineer went on to design several regular gliders later in his life; they retained the name, but the concept of trapped vortices was abandoned.

Schukin may have heard of Kasper’s invention or may have come up with his idea independently. But his use of trapped vortices was quite a bit different and much more sophisticated than Kasper’s, and, most of all, it was not a goal in itself. It was just a way to enable a whole package of other innovative solutions.

But in an attempt to answer the question of whether it would have actually worked, let’s turn to people with some first-hand knowledge of the idea.

The experiments


There is a gap in the EKIP’s development, between the first experiments in 1983 and the resumption of work in the late 80s. It is very difficult to tell what was happening with the project during those years – some say it was worked on in secret, in conjunction with the Soviet military; others say it was completely abandoned due to lack of interest.

Nevertheless, it is possible to tell that Schukin, at least, did not forget his creation: during this time he was a frequent visitor to various Soviet universities and research institutions, presenting the idea of the EKIP.

During one visit to Moscow State University sometime in the 80s, his presentation was attended by Sergei Ivanovich Chernyshenko: a young researcher with a degree in fluid dynamics.

Currently, Chernyshenko is a professor in the Department of Aeronautics at Imperial College London. He remembers talking to Schukin after the presentation, discussing how difficult it is to design landing gear for very large aircraft, and how an air cushion could be a way around that. While many accounts describe Schukin’s decision to equip the EKIP with a cushion as a matter of poor airport infrastructure in Russia, Chernyshenko’s account gives another angle to it.

In subsequent years, despite conducting research in the same field, the paths of the two scientists did not cross again. But Schukin’s research remained on Chernyshenko’s radar. In 2000, he left Russia and became a professor in the UK; in 2005, he took the position of scientific coordinator of VortexCell2050: a project which united aerodynamics specialists from half a dozen European universities around the questions Schukin’s team had tried to answer a decade earlier.

The idea behind VortexCell2050 was also indicative of its time. The early 2000s saw a resurrection of the flying wing: Airbus was in the thick of designing one, and Boeing, having bought McDonnell Douglas, had resumed work on its half-finished design.

In an effort to boost the European aviation sector, the European Commission funded a massive research project that could benefit some of those developments. Using trapped vortices for boundary layer control was one of the ways the massive airliners of the future could be made more efficient, and Chernyshenko – one of the leading aerodynamics specialists in Europe, with some indirect experience of similar projects – was the perfect man to lead it.

He and the whole team behind VortexCell2050 went to great lengths to acknowledge the contributions of both Kasper and Schukin to the idea. Nevertheless, they describe both the Kasper Wing and the EKIP as “controversial”: in both cases, the most significant findings were never published, and the actual characteristics of the aircraft were unknown.

To test the idea of trapped vortices, the project’s participants had to start almost from scratch, although at least some information about Schukin’s experiments was available from people who observed them firsthand.

The project ended in 2009, with a lot of experiments conducted and a massive amount of data gathered on the phenomenon of trapped vortices. It showed that such a system works and that, in theory, it could make aircraft a lot more efficient. But for that, some specific criteria have to be met.

A question of size

When AeroTime asked Chernyshenko whether, in his opinion, the EKIP really had the potential to revolutionize aviation, his answer was sceptical – but not towards Schukin or his ideas.

“The advantages and disadvantages of EKIP are very much like the advantages and disadvantages of regular blended-wing-body aircraft. If you look at Boeing or Airbus BWB designs – EKIP is basically that,” Chernyshenko said.

And those aircraft come with a big catch: if we want an efficient BWB design, we can’t make it small. An aerodynamically perfect aircraft with a small frontal profile will always be superior to an unaerodynamic aircraft with a large one. Yet if we are building a large aircraft, we can’t make that profile small, because we have to achieve a certain level of structural integrity.

In other words, we can’t make large aircraft thin, at least without some impossibly strong materials. So, large aircraft have to be thick. In Chernyshenko’s words, if we are building big aircraft, we have to sacrifice aerodynamic perfection for structural perfection.

In this case, a large flying wing – with the entirety of its surface dedicated to generating lift – becomes more efficient than a regular tube-and-wing design, whose fuselage does not generate lift.

This is the reason why all the flying wing-like projects of the 90s and 2000s were massive. The aircraft that were supposed to come out of the Airbus VELA, McDonnell Douglas BWB-1, and Boeing X-48 projects would have dwarfed the current generation of wide-body airliners.

The EKIP is in line with that. The large L3 and L4 models, drafted in the early 90s, were – in quite a counterintuitive way – a result of the pragmatic thinking of Schukin’s team. They may look strange and unrealistic from today’s perspective, but at the time it seemed like aircraft really would keep getting bigger and bigger. And so, much of EKIP’s design was an answer to exactly that.

“This is at the core of the answer to the question, whether I believe in the efficiency of the EKIP. For the small aircraft – no. If the efficiency is interpreted as aerodynamic perfection – no, not at all. Because there are good reasons why small aircraft are built like they are. But when it comes to consumption of fuel for very big aircraft, it changes a lot,” explains Chernyshenko.

Those large aircraft are bound to be rather unaerodynamic, with a thick profile that provokes separation of the boundary layer. The boundary layer control system is an answer to that, and the disadvantages it brings – the weight and the expense of operating the system – become outweighed by its advantages. For small aircraft, meanwhile, a much better solution for drag reduction is simply to make them more aerodynamic.
 
Counting liters


According to the promotional material presented by the EKIP team in 2001, all of their aircraft – big or small – would consume 1.5 liters of jet fuel per passenger per 100 kilometers. On one hand, this number seems optimistic but within the boundaries of reason. The Boeing 737 MAX 8 consumes 2.1 liters per passenger per 100 km; the Airbus A321neo, 2.4 liters.

Large aircraft consume more. The Boeing 747-400 burns 3.4 liters per passenger per 100 km; the Airbus A380, 3.3 liters. Even the latest generation of wide-body twinjets is not that much better: for both the Boeing 787 and the Airbus A350 this number sits between roughly 2.5 and 3, depending on the distance flown.

On the other hand, we have to remember that the EKIP team did not calculate their passenger capacities properly. In all cases, they just divided the cargo capacity by roughly 100 kilograms, ignoring the additional weight of passenger amenities. This is why fuel consumption per passenger should be taken with a grain of salt; the distance an aircraft can cover on its maximum fuel load is a more reliable measure.

As explained in the previous part, for smaller EKIPs this number corresponds with the current generation of regional jets, such as the Embraer E195-E2. But for bigger EKIPs, the story is different.

The EKIP L3-2, as described in 2001, would be roughly comparable to the Boeing 747. It would have a takeoff weight of 360 tons, a cargo capacity of 120 tons, and a range of 5,000 kilometers, and it was designed to carry 127 tons of fuel.

The Boeing 747-8F, the latest generation of the cargo-hauling Queen of the Skies, has a payload capacity of 138 tons and a range of 7,630 kilometers with 181 tons of fuel onboard. That gives it 42 kilometers for each ton of nominally carried fuel. For the EKIP L3-2, this number is 39.
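
For transparency, here is the arithmetic behind those two numbers, in the same back-of-the-envelope style as the E195-E2 comparison in the previous part:

```python
# Range per ton of carried fuel, from the figures quoted above.
print(f"Boeing 747-8F: {7630 / 181:.0f} km per ton of fuel")  # -> 42
print(f"EKIP L3-2:     {5000 / 127:.0f} km per ton of fuel")  # -> 39
```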

So, while the efficiency of small EKIPs is roughly comparable to contemporary jets, the larger EKIPs hold up surprisingly well against aircraft a generation newer. Unfortunately, there is not enough data to compare the largest models – the L-4 and the L4-2 – although the trend would quite likely continue. Being roughly twice as heavy and large as the Boeing 747 or the Airbus A380, those aircraft would likely be somewhat less efficient than them, but in comparison with conventional planes of their size, they would employ the boundary layer control system to the greatest effect.

Valley of Death


This feature of the EKIP is also at the core of the program’s failure. Small-scale prototypes were produced, and some flew, yet the data on their flight characteristics was never published. It is quite likely their fuel efficiency was no better than that of regular jets. There is a good chance the EKIP L2-3 prototype, had it ever taken off, would have been even less fuel-efficient than the Bombardier Learjet 60 whose engines it cannibalized – due to both its larger weight and worse aerodynamics.

Maybe it could have found a market thanks to its short takeoff and landing capability and the air cushion, but it is quite certain nobody would have bought the small and mid-sized EKIPs for their fuel efficiency.

To demonstrate the real benefits of their creation, Schukin’s team would have to build large aircraft. But those require large investments.

“The Valley of Death problem. It is easy to get a small sum of money to build a small prototype. It is relatively easy to get a large sum of money to build something very large. But to do that you have to get through the middle stage, and it is very difficult to get money for that,” Chernyshenko said.

The situation is an oft-discussed one and has plagued many developments. Just recently, the new generation of lighter-than-air aircraft – airships – fell victim to it. Before that, it killed many other seemingly very promising technologies.

But even if the valley could have been crossed, the EKIP would have encountered yet another problem.

The economics

In the early 90s, when Schukin was outlining his EKIPs and their city-sized wingspans, large aircraft seemed like the way of the future. The era gave birth both to massive BWB design projects and to regular superjumbos, such as the Lockheed Martin VLST, the Boeing NLA, and the Airbus UHCA. Airbus was the only company that went through with its idea, putting the A380 into production.

Others backed out and, as it turned out, they were right: the market for superjumbos was small, unpredictable, and without much hope for profit. The A380 was far from the bestseller Airbus hoped it would be, and is quite often classified as a failure.

This realization crept in in the late 2000s, when – a few years after the A380’s introduction – customers began canceling their orders one by one.

The VortexCell2050 project was built on the premise of researching technologies that would benefit even larger generations of aircraft – the hyperjumbos of the mid-21st century. It outlined directions for further research, but even before its end, it was quite obvious that a follow-up was not exactly expected.

“When our project was ending, I talked to people from Airbus, and there was that pessimistic feeling. They knew how to build large aircraft, but they had no intention of building them. I strongly suspect they were already realizing that those huge aircraft will have no market,” Chernyshenko explained.

Boundary layer control with vortex cells worked. We can’t know how well it worked for Schukin, but in general, with the right design, it works. To exploit it, though, one has to build a very large aircraft – and the world, at least currently, has no need for those.

This is the answer not only to the question of the EKIP’s wondrous properties but to all the conspiracy theories that surround the aircraft: it is not a case of suppressed technology, it is a case of technology which is not really needed at this point in history. Schukin’s patents expired in the early 2000s, and since the EKIP Aviation Concern no longer exists, nothing is preventing anybody – be it Boeing, Airbus, or a team of aviation enthusiasts working on the outskirts of their local airfield – from building another EKIP. There is simply no reason to do that.
 
A sliver of hope

But the aviation market is, in the long term, not exactly predictable. While superjumbos were dead by the 2010s, and the COVID-19 crisis dealt a huge blow to all wide-body aircraft, we can’t say for sure what the situation will be in a decade, or two, or five.

There might one day actually be a reason to produce very large aircraft. A BWB design is the most efficient way such an aircraft can be built, and a thick flying wing design will eventually have to deal with boundary layer separation. There will be a need to solve this issue.

“I would not be surprised if big companies, like Boeing or Airbus, would eventually take the technology that is demonstrated to work – this boundary layer control system – and add it to their existing BWB model, to extend the limit of what it can do,” Chernyshenko said.

Of course, the result would most likely not look like the EKIP: the saucer-like shape of Schukin’s creation would probably give way to more streamlined designs, and with proper airport infrastructure and advancements in landing gear technology, there would be little sense in making the aircraft carry a heavy air cushion system.

Using some form of alternative fuel was one of the features of the EKIP, and there might be some merit to that idea. Both Boeing and Airbus are currently working on hydrogen-powered aircraft projects, and China’s COMAC has started looking into it too. While a lot still has to be sorted out for liquefied gas-powered airplanes to work, there is a chance the flying wings of the future will resemble the EKIP at least in this regard.

Let’s not forget the engines. Throughout the whole development, Schukin talked of hyper-efficient turbofans. While they never became a feature of the flying saucer, the high-bypass evolution was quietly happening in the background. When the development of the EKIP started in the late 70s, the Soviet Union was largely stuck with low-bypass Kuznetsov NK-8s, and even in the early 80s some EKIP descriptions listed it as the main engine for the project. In comparison with it, the current generation of similar turbofans uses roughly one-third less fuel in cruise flight – a result which would have been on the verge of science fiction in the 80s.

The fabled safety of the EKIP is also much less relevant in the current climate. Despite the sharp pre-COVID rise in air travel, the number of accidents is decreasing. In the end, a lot of small, procedural changes to the manufacturing, maintenance, and operation of aircraft have improved the safety record far more than radical, profit-unfriendly efforts – such as the introduction of a completely different form of vehicle – ever could.

So, while the EKIP did not succeed, some of the innovative ideas that composed the aircraft have found their way into real life. Others will likely follow in the near future, and while the most central ones – primarily, the boundary layer control system – may have to wait decades to become relevant again, Schukin’s contribution can still find its way into aviation.

Now, if only it would be possible to explain all of that to the proponents of suppressed technology conspiracy theories.



1. How did matter come together to make planets and life in the first place?
1.1. Are we really made of star stuff?


Lower Grade
Grades K-2 or Adult Naive Learner

Have you ever wondered what we’re made of? Would you believe that you and me and all of the plants and all of the animals that we can see are all made of the same kind of stuff that makes up books and rocks and mountains and the ocean? We’re all made from the same kind of stuff, and we call that stuff “matter”.

Can you guess where all of that matter came from? It was all made in space! A lot of that stuff, that matter, that makes up you and me and the place we live was made inside of stars long, long ago. So long ago that it’s older than your parents, it’s older than the dinosaurs, and it’s even older than the Sun!

We’re all made of the stuff from stars!

Grades 3-5 or Adult Emerging Learner

The story of where we came from begins in space, a long time ago, even before the Sun and the planets in our solar system formed. A lot of the stuff, the matter, that makes up you and me and everything we see on Earth was formed inside of stars long ago.

Sometimes when we talk about stars, we talk about them as if they’re living things. So, we’ll say that a star is born, it has a life, and then it dies, and we call this a “lifecycle.” A star can be born and start its lifecycle when a bunch of dust and ice in space comes together. Just like when you drop something and it falls because of gravity, when there’s a lot of ice and dust together in space, it can fall together because of gravity and make up an entire star. Once a star is born, it starts to make light and heat, just like our Sun, and, when that happens, it starts creating different kinds of matter inside. Even right now, our own Sun is making new kinds of matter inside of it from the other stuff that’s there.

So, how does all of that new matter that’s created inside of stars get out and end up inside of you and me and other stuff? Well, some stars, when they get old and are at the end of their life cycles, will explode and send all of that new matter out into space. Then, later on, when new stars and new planets are forming, some of that new matter ends up in them. So, a lot of the matter that’s inside of our Sun and inside of our planet and even inside of us was made within stars long, long ago. That means that you are made of star stuff!

Grades 6-8 or Adult Building Learner

The story of where we came from begins in space. All of the matter that makes up you and me and everything we see on Earth is composed of the chemical elements. If you think about all of the elements on the periodic table, things like oxygen and carbon and iron, almost all of those elements that we see and that make up the stuff around us formed inside of stars long ago.

When the universe was young, before there were any galaxies or stars or planets, the only elements that existed were hydrogen and helium and a little bit of lithium. These are the first and lightest elements on the periodic table. As time passed in the universe, some of the earliest matter, the hydrogen and helium and lithium, started to clump together. When this happened, it made the first stars.

Something really cool can happen inside of stars, where nuclear fusion causes some of the lighter elements to come together, or fuse, and make heavier elements. So, things like hydrogen and helium were fused together to make heavier elements, like carbon, and then those heavier elements could make even heavier elements. Stars are like big chemical element factories! Even our Sun is currently making new elements inside of it.

All stars will eventually burn out, but, for some of the stars, when they burn out, or reach the end of their “life cycles” they can explode and blow out all of the heavy elements that were inside. These explosions can also make new, even heavier elements. Then, later on, when new stars and new planets are forming from the matter in space clumping together, some of these heavier elements end up in them. So, a lot of the matter that’s inside of our Sun and inside of our planet and even inside of us was made within stars long, long ago. That means that you are made of star stuff!

Grades 9-12 or Adult Sophisticated Learner

Our planet and all of the living things on it are made from matter. Most of that matter was created during the big bang and the rest was mostly created within the cores of ancient stars. During the big bang, all of the hydrogen, most of the helium, and some of the lithium in our universe was created from subatomic particles, like protons and neutrons. Protons and neutrons are sometimes called “nucleons”, since they are in the nuclei of atoms. Since protons and neutrons came together to make the nuclei of these lighter elements during the big bang, we call this process “big bang nucleosynthesis.”

It was this earliest matter, composed of the three lightest elements on the periodic table, that made the very first stars. These stars were big and bright, and they burned out really fast. We call them the “first generation stars”. It was within the cores of these first stars that the process of nuclear fusion first started creating elements heavier than lithium. The hydrogen and the helium inside were squeezed so tightly, and with so much energy, that they started forming things like carbon and nitrogen and oxygen. Much as with big bang nucleosynthesis, we call this process of forming new elements through nuclear fusion within stars “stellar nucleosynthesis.”

When those first stars “burned up” their elemental “fuel,” they went through a process called “supernova”, where the stars explode and send a lot of their matter out into space. Then, new stars were able to form from the matter that had clumped up in space, including these heavier elements that were made from stellar nucleosynthesis. In this way, each new generation of stars will have more and more of the heavier elements inside of them.

Something really interesting is that the process of stellar nucleosynthesis can make all of the atoms on the periodic table up to iron (element number 26), but it actually takes too much energy to make the elements that are heavier than iron inside of stars. Can you guess where the energy might come from, then, to make all of those other elements we see in nature that are heavier than iron? When a star goes supernova and explodes, there’s actually enough energy to make those heavier elements! We call this process of nucleosynthesis “supernova nucleosynthesis”. Together, all of the various nucleosynthesis reactions explain how all of the matter that makes up our world and living things came to be. That’s why you’ll often hear people exclaim that we are made of star stuff!
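To put a number on the energy that stellar nucleosynthesis releases, here is a minimal back-of-envelope sketch in Python. The atomic masses are standard reference values rather than numbers from this lesson, so treat it as an illustration: fusing four hydrogen atoms into one helium-4 atom converts about 0.7% of their mass into energy.

```python
# Back-of-envelope: energy released when stellar fusion turns
# four hydrogen atoms into one helium-4 atom (E = mc^2).
# Atomic masses in unified atomic mass units (u); standard reference values.
M_H1 = 1.007825     # hydrogen-1
M_HE4 = 4.002602    # helium-4
U_TO_MEV = 931.494  # energy equivalent of 1 u, in MeV

mass_in = 4 * M_H1
mass_defect = mass_in - M_HE4        # the mass that "disappears"
energy_mev = mass_defect * U_TO_MEV  # released as light and heat

print(f"mass converted: {mass_defect:.6f} u "
      f"({100 * mass_defect / mass_in:.2f}% of the input)")
print(f"energy released: {energy_mev:.1f} MeV per helium-4 nucleus")
```

That modest-sounding 0.7%, about 26.7 MeV per helium nucleus, is what keeps a star like our Sun shining for billions of years.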

Organic molecules are made in space:

Simple organic molecules, some of which are the building blocks in the biochemical pathways of life, can be produced in space! Astrochemists can use telescopes to observe a large variety of organic molecules within the dusty clouds in our galaxy. Researchers have discovered organic molecules on the surfaces of asteroids and comets. Even some meteorites have a large number of organic molecules within them, showing us that they formed in space. Also, some astrobiologists working in laboratories can emulate the energetically dynamic conditions of interstellar space. For instance, they can take icy mixtures of water, methanol, carbon dioxide, ammonia, and other simple compounds and expose them to UV radiation (as they would be exposed to from stars in space). What such scientists observe is the production of simple amino acids, which are the basic units of proteins. Proteins are large biological molecules which serve as structural components in all life forms, as well as performing the majority of life’s functions, including DNA replication and repair, metabolism (how a life form makes energy from food), and responding to stimuli: all of which are fundamental to an organism’s daily life. This work shows that chemistry that naturally occurs in space can lead to the production of biologically relevant molecules. Since these reactions are thought to occur wherever new stars and planets form, amino acids could be introduced to the surfaces of all newly formed planets, and this process could have played a key role in the origin of life on Earth.

We’re still learning about how nucleosynthesis works:


For a long time, it was thought that the only way to make elements heavier than iron was supernova nucleosynthesis in the largest stars, but new research suggests that other types of supernovae and other types of stellar processes may also form many of the heavier elements. For instance, when neutron stars collide and merge (something we can observe using gravitational waves), the energy should be enough to form many of the larger elements as well. The astronomer Jennifer Johnson has recently updated a version of the periodic table of elements to show the potential stellar environments where elements form based on our current knowledge (The Origin of the Elements [ohio-state.edu]). However, as she points out, “we still don’t know everything.”

Beyond these stellar ways of making heavy elements, we also know that some of the matter in our bodies and in rocks was formed more recently. For instance, cosmic ray spallation (or cosmic ray nucleosynthesis) is when cosmic rays bombard elements and cause them to form new elements. This is how a lot of our carbon-14 is made: cosmic rays that hit atoms of nitrogen-14 in our atmosphere can produce carbon-14. Sadly, there’s also a lot of carbon-14 in our bodies and our atmosphere that came from nuclear weapons testing. This carbon is sometimes called “bomb carbon”.

Most of the hydrogen and helium in the universe was created in about 5 minutes:


Our current models of how the first elements formed tell us that all of the hydrogen and almost all of the helium in our known universe formed (as bare nuclei) within about the first five minutes after the Big Bang. As the universe cooled, those nuclei gained their electrons and became atoms of H and He. Even though nucleosynthesis inside of stars and from supernovae (and probably from other stellar events) has formed a lot of other elements since the Big Bang, most of the matter that we see in all of the stars and galaxies we can observe is composed of those primordial atoms of hydrogen and helium.
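That three-quarters hydrogen to one-quarter helium split by mass can even be estimated by simple counting. Here is a toy sketch, assuming the standard textbook value of roughly one neutron for every seven protons at the time of big bang nucleosynthesis (a figure not given above) and that essentially every neutron ended up locked inside helium-4:

```python
# Toy estimate of the primordial helium mass fraction.
# Assumption (standard textbook value, not from the text above):
# about 1 neutron for every 7 protons at the time of nucleosynthesis.
n_per_p = 1 / 7

# Nearly all neutrons end up bound in helium-4 (2 protons + 2 neutrons),
# and every 2 neutrons drag 2 protons with them; leftover protons stay
# as hydrogen. Treating neutron and proton masses as equal, the helium
# mass fraction is Y = 2(n/p) / (1 + n/p).
Y_helium = 2 * n_per_p / (1 + n_per_p)

print(f"helium mass fraction   ~ {Y_helium:.2f}")      # ~0.25
print(f"hydrogen mass fraction ~ {1 - Y_helium:.2f}")  # ~0.75
```

Counting alone lands within a whisker of the observed abundances, which is one of the big successes of the Big Bang model.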

 

An artist's impression of a black hole accretion disk. (Mark Garlick/Science Photo Library/Getty Images)

Black Holes Could Be Inadvertently Making Gold, Astrophysicists Say

16 NOVEMBER 2021

The Universe may have more ways of forging heavy elements than we thought.

The creation of metals such as gold, silver, thorium, and uranium requires energetic conditions, such as a supernova explosion or a collision between neutron stars.

However, a new paper shows that these elements could form in the swirling chaos that rings an active newborn black hole as it swallows down dust and gas from the space around it.

In these extreme environments, the high emission rate of neutrinos should facilitate the conversion of protons to neutrons – resulting in an excess of the latter, required for the process that produces heavy elements.

"In our study, we systematically investigated for the first time the conversion rates of neutrons and protons for a large number of disk configurations by means of elaborate computer simulations, and we found that the disks are very rich in neutrons as long as certain conditions are met," said astrophysicist Oliver Just of the GSI Helmholtz Centre for Heavy Ion Research in Germany.

In the beginning, after the Big Bang, there weren't a lot of elements floating around. Until stars were born and started smashing together atomic nuclei in their cores, the Universe was a soup of mostly hydrogen and helium.

Stellar nuclear fusion imbued the cosmos with heavier elements, from carbon all the way up to iron in the most massive stars, seeded through space when those stars died.

But iron is where core fusion hits a snag. Fusing iron consumes more energy than the process generates, causing the core temperature to drop, which in turn results in the star dying in a spectacular kaboom – the supernova.

It's that spectacular kaboom (and the kabooms of colliding neutron stars) where the heavier elements are fused. The explosions are so energetic that atoms, colliding together with force, can capture neutrons from each other.

This is called the rapid neutron capture process, or r-process; it needs to happen really quickly, so that radioactive decay doesn't have time to occur before more neutrons are added to the nucleus.

It's unclear whether there are other scenarios in which the r-process can take place, but newborn black holes are a promising candidate. Namely, when two neutron stars merge, and their combined mass is sufficient to tip the newly formed object into the black hole category.

Collapsars are another possibility: the gravitational collapse of the core of a massive star into a stellar-mass black hole.

In both cases, the baby black hole is thought to be surrounded by a dense, hot ring of material, swirling round the black hole and feeding into it, like water down a drain. In these environments, neutrinos are emitted in abundance, and astronomers have long hypothesized that r-capture nucleosynthesis could be taking place as a result.

Just and his colleagues undertook an extensive set of simulations to determine if this is indeed the case. They varied the black hole mass and spin, and the mass of the material around it, as well as the effect of different parameters on neutrinos. They found that, if conditions are just right, r-process nucleosynthesis can take place in these environments.

"The decisive factor is the total mass of the disk," Just said.

"The more massive the disk, the more often neutrons are formed from protons through capture of electrons under emission of neutrinos, and are available for the synthesis of heavy elements by means of the r-process.

"However, if the mass of the disk is too high, the inverse reaction plays an increased role so that more neutrinos are recaptured by neutrons before they leave the disk. These neutrons are then converted back to protons, which hinders the r-process."

This sweet spot in which heavy elements are produced most prolifically is a disk mass between 1 and 10 percent of the mass of the Sun. This means that neutron star mergers with disk masses in this range could be heavy element factories. Since it's unknown how common collapsar disks are, the jury is still out for collapsars, the researchers said.
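The sweet spot the researchers describe is narrow enough to state programmatically. A minimal illustration that simply encodes the 1 to 10 percent-of-the-Sun's-mass window quoted above (the function name and the sample disk masses are made up for the example):

```python
# Illustrative check of the disk-mass "sweet spot" for r-process
# nucleosynthesis reported by Just and colleagues:
# roughly 0.01 to 0.1 solar masses.
SWEET_SPOT_MIN = 0.01  # disk mass, in solar masses
SWEET_SPOT_MAX = 0.10

def disk_favors_r_process(disk_mass_solar: float) -> bool:
    """Too light: too few neutrons are produced.
    Too heavy: neutrinos get recaptured and turn neutrons back
    into protons before they can leave the disk."""
    return SWEET_SPOT_MIN <= disk_mass_solar <= SWEET_SPOT_MAX

for mass in (0.001, 0.03, 0.3):  # hypothetical example disks
    print(f"{mass:.3f} solar masses -> r-process favored: "
          f"{disk_favors_r_process(mass)}")
```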

The next step will be to determine how the light emitted from a neutron star collision can be used to calculate the mass of its accretion disk.

"These data are currently insufficient. But with the next generation of accelerators, such as the Facility for Antiproton and Ion Research (FAIR), it will be possible to measure them with unprecedented accuracy in the future," said astrophysicist Andreas Bauswein of the GSI Helmholtz Centre for Heavy Ion Research.

"The well-coordinated interplay of theoretical models, experiments, and astronomical observations will enable us researchers in the coming years to test neutron star mergers as the origin of the r-process elements."

The research has been published in the Monthly Notices of the Royal Astronomical Society.

Modelling the Metaverse: The Fight to Profit from the Next Big Thing

By Pengana Capital 

November 26, 2021


Key Takeaways


• The hardware required for teleporting into the Metaverse is still clunky and the content scant, but the remaining technological hurdles would seem to have reached the stage where they can be solved with time and money.

• A concert in 2020 attended by 27 million avatars already suggests how some experiences could be just as good, if not better, experienced in the Metaverse.

• Valuing technology companies has always involved trying to get a firm handle on the future rewards that will come to those that succeed in exploiting totally new platforms.

• For Apple, the business model is straightforward: replicate in 3-D the experience of using the iPhone. For Facebook, the model involves using its own hardware as a cudgel for keeping a larger share of the advertising spoils.

CEOs spend a lot of time communicating their visions for their companies to a naturally skeptical public. Traditionally, this involves traveling around the country or world to interact face-to-face in conference rooms or auditoriums, or, over the past two years, logging into an endless series of video calls. But recently the top executive at a Prague-based technology startup tried presenting in another realm altogether.

This presentation came during a virtual conference hosted by JPMorgan titled “Ready Player One,” after the science-fiction novel by Ernest Cline about a society in the near-future when all of humanity uses a virtual simulation to escape the drab reality of their lives. The speaker was Artur Sychov, CEO of Somnium Space, one of the first open-source platforms designed specifically for virtual-reality users. Instead of wearing the usual business suit, Sychov appeared as a black-clad superhero-like avatar. Behind his avatar, you could see the sparse expanse of the Somnium landscape extending in every direction. The sun was just starting to peek over a virtual horizon, and a handful of buildings rose nearby including the dome of a planetarium.


Artur Sychov, CEO of Somnium Space. Source: PodFest Asia

For those logging in from home laptops and office workstations, Sychov provided an opportunity to experience a concept which has been a key talking point for many tech industry CEOs in 2021. That concept—now commonly called the Metaverse—is a totally immersive version of the internet where, instead of just viewing and listening to content, users feel as if they are in it. Initially built atop the existing internet, the Metaverse could eventually become so pervasive and so absorbing that it subsumes the current 2D web (at least that’s the idea), offering compelling facsimiles for almost every activity people can experience in the physical world, with the possible exceptions of eating and sleeping.

Recent evidence of tech companies’ intense focus on the Metaverse abounds. At a Microsoft conference in May, CEO Satya Nadella spoke at length about his company’s new “Azure Digital Twins” that allows its enterprise customers to create live always-on simulations of their products, facilities, processes, and people. Google has gone on a major hiring spree to beef up its engineering staff for augmented reality (AR)1 search. Apple has quietly been buying up early-stage AR and virtual reality (VR) startups in areas from advanced optics to virtual events.

Of the large-cap players, Facebook and its CEO Mark Zuckerberg have publicly been the most all-in on the concept, declaring that in the future people will associate Facebook more with the Metaverse than with social media, and pledging a major increase in capital expenditures to ensure that they do. In October 2021, the company even went so far as to change its name to Meta to reflect the seriousness of that ambition (and to try to move on from the controversies stirred by a whistleblower’s release of internal documents about the potential harmful effects of its current products). A time when the physical and virtual worlds blend into one has never felt more possible.

To a point anyway. Teleporting into the Metaverse currently requires bulky devices that resemble inches-thick blindfolds—not something many people would be comfortable wearing while working, even from home. The hardware problem was demonstrated by Sychov’s presentation: no one else on the call was wearing a VR headset. When he said that it was “hard to explain the degree to which my brain believes that I’m here, the beauty of that sunrise,” everyone had to take his word for it. His comments that “we will have cities, maybe whole countries” developing in virtual reality where people will work full time sounded even more fantastical.

However, there is another more-grounded perspective to consider on the Metaverse’s progression from science fiction to viable business model. Igor Tishin, an Information Technology analyst at Harding Loevner, has thought a lot about the Metaverse and its implications for big technology companies—Facebook, Apple, Google, and Amazon as well as chipmakers and related equipment suppliers like NVIDIA, TSMC, and ASML—whose growth depends on continually expanding the applications of their technology. Still-evolving areas of technology like the Metaverse may be highly speculative, but these companies’ share prices already reflect an expectation that such growth opportunities will become a reality, and not in a too-distant timeframe. “A lot of people get hung up on how we’re going to get from where we are today to there,” Tishin says. By “there,” he means the Metaverse. “I’m more sanguine about the ability, and motivation, of these companies to figure it out.”


A Brief History of the Metaverse


The concept of the Metaverse has been a source of fascination, and hype, since before there was a commercial internet. Neal Stephenson originally coined the term in his 1992 novel Snow Crash. In the novel, considered a classic by tech professionals, users entered this original “Metaverse” via a pair of goggles onto which a 3-D world was holographically projected with a bluish beam of light. So, in 2012, when Google co-founder Sergey Brin arrived at a charity event in San Francisco wearing a pair of goggles with an eerie bluish, laser-like spot covering his right iris, it sent shockwaves through the tech industry. Within two years, however, Google’s consumer AR goggle product, Google Glass, would become the company’s highest-profile failure. Other companies’ stumbles have followed. Magic Leap, a much-hyped AR company, has repeatedly fallen short of its soaring ambitions to, among other things, create a humanlike virtual assistant or city-sized holographic overlay. And forecasters have continued to overshoot. In 2016, one industry group declared the market for AR and VR would reach US$150 billion in 2020—they were off by US$117 billion.2

Yet there is still an undeniable feeling that something must eventually replace the handheld screen. The exact next step for consumer tech is still up in the air, but many futurists in Silicon Valley believe it will be some combination of devices and platforms that delivers the sense of “presence” described by Sychov. To date, VR has mainly been applied to gaming. But recently tech captains have talked up VR’s practical applications like remote work. In interviews, Zuckerberg has reported conducting more of his meetings via the company’s new Horizon Workrooms VR platform and how helpful it is not only to meet remotely but also to share a sense of space with other people that allows a better read on body language or wheeling around to the same side of a table when working through a problem with a colleague.

At this point, anyone able to join Mark Zuckerberg in the Metaverse will be doing it as an avatar—interacting cartoon-to-cartoon, as it were. Achieving the body scanning, processing speeds, high-resolution cameras, and lens/displays capable of real-time photo-realistic image generation is an essential precondition of the next stage of the development of the Metaverse. Streaming capacity will also need to expand. In April 2020, Fortnite developer Epic Games co-produced a Travis Scott concert attended by more than 27 million avatars, all putatively beamed into the same venue in the Metaverse. In actuality, current technology can only handle a hundred or so avatars meeting at once; a technique called “sharding” is required to bring together thousands of these avatar groups to create a mostly identical and near-simultaneous communal digital experience. For billions to move about freely together in the Metaverse, the shards will somehow need to fuse.
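Some rough arithmetic shows why sharding is unavoidable for now. Here is a minimal sketch in Python using the figures quoted above; the hash-based shard assignment at the end is a common engineering pattern for splitting users across server instances, not a description of Epic's actual system:

```python
import hashlib
import math

# Rough sharding arithmetic for the concert described above.
ATTENDEES = 27_000_000
AVATARS_PER_SHARD = 100  # roughly what current tech handles in one space

shards_needed = math.ceil(ATTENDEES / AVATARS_PER_SHARD)
print(f"shards needed: {shards_needed:,}")  # 270,000 parallel copies

# A common (illustrative) way to pin a user to a shard: hash the
# user id and take it modulo the shard count, so assignment is stable.
def shard_for(user_id: str, shard_count: int = shards_needed) -> int:
    digest = hashlib.sha256(user_id.encode()).hexdigest()
    return int(digest, 16) % shard_count

print(shard_for("avatar-12345"))  # same user always lands on the same shard
```

Until those 270,000 near-identical copies of the venue can be fused into one, billions moving about freely together remains aspirational.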

Overcoming such hurdles is not a trivial endeavor. Nevertheless, Tishin says, “my suspicion is that we’re solidly in the realm of an engineering problem as opposed to a scientific problem.” The latter, like commercially viable nuclear fusion, cannot be solved without some fundamental breakthrough. With an engineering problem, the breakthroughs have already occurred, and the remaining challenges can be solved with time and money.


Next Steps


For now, the Metaverse is not a unified phenomenon, but a loose collection of devices, digital environments, and related technologies. Microsoft has spoken of its Digital Twins as an “enterprise Metaverse.” Google, as it happens, never fully abandoned Google Glass. It has continued to develop and sell the eyewear for enterprise customers, including the shipper DHL, whose workers use the glasses for item picking at its warehouses. The data and images that appear in the pickers’ field of vision, too, form an emerging Metaverse. Somnium Space is clearly a Metaverse, but so is Decentraland, where, earlier this year, parcels of digital land were going for the cryptocurrency equivalent of hundreds of thousands of dollars. Some users experience Decentraland in VR, but most simply choose to use the 2-D interface of their tablet or smartphone.

There will come a time, however, when all these individual platforms will merge to form the Metaverse. At least, that’s the vision articulated by venture capitalist Matthew Ball in a pair of widely circulated essays in May 2020 and June 2021.3 Ball elucidated some half dozen different developments (beyond the required engineering advancements) that need to occur before an all-encompassing Metaverse can be realized. One of them is a set of standard programming protocols—like HTML with which the web is built—to provide interoperability between platforms. Another is an integrated, functioning economy within the Metaverse that matches the freedom from the physical world’s constraints with a system—presumably relying on some version of blockchain—that establishes immediate and immutable provenance over economic goods and services.

No one can say with precision how long it will take for these pieces to fall into place or how big the economic opportunity could be. “The entire global economy is roughly US$84 trillion now,” Tishin observes. “In 10 years, it will be way over US$100 trillion. What percentage of that will have moved into the Metaverse? It’s clearly not all of it, or probably even most of it, but if it’s even one-tenth, you’re talking several trillions of dollars.”

During his “Ready Player One” conference presentation, Sychov tried to give a sense of how he sees the business opportunity unfolding in his corner of the Metaverse. Currently, Somnium Space generates most of its revenue from “primary” sales, including plots of virtual land as well as the commissions it charges on avatars,4 virtual cars, and other non-fungible tokens (NFTs). However, he projects the company’s revenues will increasingly come from the cut it takes on transactions between users when they exchange goods, services, games, and even whole sub-worlds created within its platform. As the user base grows from the couple thousand avatars a month who interact inside its virtual world now, he expects advertising naturally to follow. Already, Sony has erected a pop-up shop in Somnium Space to advertise its VR headsets.

Sychov predicts that it won’t be long before major brands catch on to the possibilities of taking things a step further. While commerce in the Metaverse at the early stages is likely to focus on virtual advertising for physical goods (a glorified extension of the internet, in other words), the next stage will be a shift to advertising for and selling virtual ones.

At one level, it is a preposterous (and deeply unsettling) idea that eventually many of us will be spending so much of our time inside the Metaverse that a company like Nike, for example, could be earning a significant portion of its profits producing shoes for avatars instead of physical sneakers. Tishin, however, isn’t ready to dismiss the possibility outright. “We’ve already seen with these concerts how some types of experiences could be just as good, if not better, experienced in the Metaverse. Travel could be another. Theme parks. The thrill of driving a really fast car. So, you’re likely to see the virtual economies working in parallel with the real ones.” There are important environmental and societal implications to what he is saying. With billions of people around the world rising into the middle class or else demanding more equal access to goods, services, and lifestyles, it may be more preposterous to assume the physical world can continue housing all those hopes and desires on its own.


Modeling the Metaverse


Of course, for the Metaverse to become the grand parallel economy Tishin describes, the devices will need to be so capable that, like smartphones today, they become essential tools to most consumers. Companies, meanwhile, will need to demonstrate the power of the platforms so they can attract developers to build experiences that draw in customers. As Tishin points out, the incipient Metaverse is at a common juncture in the development of new technology. Software like Netscape and Internet Explorer had to be written and proven useful for web browsing before they triggered the content explosion upon which our current 2-D internet is based. Amazon had to spend vast sums making the Kindle into a capable device before publishers took e-books seriously. Once they did, Amazon was able to cash in on the real prize, which was selling e-book content, not Kindles.

The history of technological development is important context for assessing the value of tech companies. Investors generally value companies based on projections for the rate of growth of their current businesses, but valuing tech companies also involves trying to get a financial handle on just this process: the future rewards that will come to those companies that succeed in exploiting totally new platform paradigms.

According to Tishin, two of the Metaverse’s biggest winners will likely be Facebook and Apple. Compared to Facebook, Apple has been relatively silent about its plans, “which suggests how serious they are,” says Tishin. Apple is the one company with the expertise and control over the entire technology stack to solve the engineering challenges in creating a viable Metaverse. The company has powerful, well-designed proprietary hardware, from chips to the devices themselves. These devices, running on Apple’s own operating system, form an ecosystem with over one billion users. And Apple has a straightforward business model for the Metaverse: to replicate in 3-D the experience of using an iPhone.

Tishin estimates what the impact of the Metaverse should be on Apple’s valuation by starting with the iPhone. Today, the iPhone alone generates about US$200 billion in revenue annually. In Tishin’s valuation model, he projects that in five years Apple will have an additional business selling goggles and glasses with AR/VR augmentation whose revenues will be just short of what the iPhone generates today—call it US$150 billion a year. To back the impact of that new business into his current estimation of Apple’s fair value, Tishin multiplies US$150 billion by Apple’s 37% gross margins, then multiplies that number by 10 (a conservative stock price multiple), and then again by his ballpark 75% probability that the whole effort actually materializes. When he divides that number (US$416 billion) by Apple’s 16.5 billion outstanding shares, he comes up with an added value from the hardware for accessing the Metaverse of US$25 per share by mid-decade. That’s one reason that Apple’s recent share price of around US$140 a share, at 28 times trailing earnings, actually strikes Tishin as pretty reasonable, even with the potential for some cannibalization of its existing hardware business.
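Tishin’s back-of-envelope is simple enough to reproduce. A quick sketch using only the figures quoted above, all of which are his stated assumptions rather than reported financials:

```python
# Reproducing Tishin's per-share estimate for Apple's Metaverse hardware.
revenue = 150e9        # projected annual AR/VR hardware revenue, US$
gross_margin = 0.37    # Apple's gross margin
multiple = 10          # conservative stock price multiple
probability = 0.75     # ballpark odds the business materializes
shares = 16.5e9        # Apple shares outstanding

business_value = revenue * gross_margin * multiple * probability
per_share = business_value / shares

print(f"implied business value: US${business_value / 1e9:.0f} billion")  # ~416
print(f"added value per share:  US${per_share:.2f}")                     # ~25
```

Run as written, it lands on the same US$416 billion and roughly US$25 per share cited above.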




An artist’s rendering of Apple’s planned VR/AR headset based on leaked documents and insider reports. It reportedly will hit the market in 2022 at a price of US$3,000. That compares to US$300 for the bulkier Facebook Oculus, suggesting the device will largely be used as a proving ground for even sleeker glasses. For its part, the company has only acknowledged it’s working on both projects. Source: Antonio DeRosa (aderosa.myportfolio.com)

Modeling the value of Facebook’s future in the Metaverse is a little more complicated. In a recent earnings call, Zuckerberg suggested that the company will spend over US$10 billion on Metaverse development in 2021. The only significant revenue those efforts are generating so far (annualized at US$3 billion in sales) is from its Oculus headset, widely regarded as the leading VR gadget on the market. In partnership with Ray-Ban, Facebook also recently unveiled its “smart glasses,” which are presently only smart enough to take photos, record videos, play music, and answer phone calls, but according to Facebook could soon deliver on some rudimentary aspects of the Metaverse in a stylish, user-friendly form factor. Zuckerberg has been careful to note, though, that none of the new hardware will necessarily change the way the company earns its profits—through advertising.

For a glimpse of how the company’s future advertising profits may play out, Facebook insiders have directed Tishin to Facebook Shops. This initiative, launched in partnership with the e-commerce platform Shopify, is Facebook’s attempt to build a marketplace that can rival Amazon, particularly for the new mode of “inspirational” shopping that is driven by social interactions and recommendations from friends. The Shops platform lets you use your phone’s video function to view a selfie of your face and “try on” a shade of lipstick that’s caught your eye to see how it looks with your skin tone. The company envisions a time when the lipstick—or your old college roommates, or your dermatologist—will appear with a simple wave of everyone’s wrist. In any event, all are more value-added ways to pull people into the company’s orbit so it can keep serving them ads.



AR lipstick on Facebook Shops. Source: Facebook

So how much is the Metaverse potentially worth to Facebook? Applying the same logic that he does to Apple—that in time the company will have a separate Metaverse-related business comparable to the size of its core business today—and accounting for a similar level of cannibalization, Tishin figures that in five years Facebook could be generating 50% more in ad revenues than it would be without the Metaverse. But he also thinks Facebook may be playing down the full scope of its monetization plans. “What if its platform becomes one of the main places where hundreds of millions of people around the planet are working, buying virtual tennis shoes, spending the afternoon in Paris? Wouldn’t it want to start taking a cut of developers’ revenue for bringing those experiences to people? In which case, the revenue opportunity could increase by an order of magnitude.”

This raises the question of why Facebook would spend time and money fiddling around with headsets. Although Facebook pulls in a fraction of Apple’s top line, its gross profit margins, which top 80%, dwarf those of its Cupertino rival. And Apple is the most efficient hardware maker in the world. But, Tishin says, Facebook could face a problem. If Apple does succeed in building the world’s best Metaverse hardware and operating system, it could also be able to set the fee structure for those who want to use its enabling platform.

Apple’s track record in that area can’t give Zuckerberg much peace of mind. The up to 30% commissions Apple charges on sales in its App Store have long been decried by developers as usurious. Recent concessions to app developers and a September court decision in a lawsuit by Epic Games suggest that Apple’s commission policies may be on the wane. But Apple has found a clever way both to deflect attention from these practices and to create an alternative source of revenue: supporting the push for greater privacy protections. The safeguards it has implemented to prevent iPhone users’ data from being shared with third-party advertisers have severely crimped the ability of Facebook and Google to serve those users ads, effectively siphoning a not-insignificant portion of ad revenue to … Apple. “I don’t think every company is going to have exactly the same vision here,” Zuckerberg said recently, clearly a veiled reference to the potential for Apple’s current walled garden to carry over into the Metaverse. “I think some are going to have more siloed visions, and I, at least, believe that in order for this to work really well, you want it to be very portable and interconnected.”

Tishin sees Facebook’s hardware efforts (and the implicit threat that it could build out its own operating ecosystem that competes directly with Apple’s future Metaverse i-wearables business) as chiefly a form of leverage. “I could easily see Zuckerberg going to Apple CEO Tim Cook at some point and saying, ‘Look, we don’t want to keep making headsets and sunglasses, we’d much rather stick to selling ads, but you need to give us a break on the cut you take from people and businesses having entered the Metaverse through your wormhole.’”

As Tishin watches the Big Tech players arming themselves with the technology and strategies they believe will help them win, he is reminded of massively multiplayer online games (which, by the way, would be awesome inside the Metaverse). “All those warriors show up for the ultimate battle and, at first, there’s 2,000 of them. But then when it’s down to, like, six, they have to say to one another: Let’s stop. We can have a really nice life if we compromise instead of try to fight.”


INVESTMENT FUND DETAILS
The Pengana Harding Loevner International Fund is an actively managed portfolio of listed international securities that seeks to obtain returns greater than the MSCI All Country World Total Return Index (net) in $A over the medium to long term.

Contributors

Analyst Igor Tishin, PhD contributed research and viewpoints to this piece.

Endnotes

1 Where VR refers to a wholly virtual experience, AR involves the layering of the virtual atop the physical.

2 “Augmented Reality Market Size, Share & Trends Analysis Report by Component, by Display (HMD & Smart Glass, HUD, Handheld Devices), by Application, By Region, and Segment Forecasts, 2021-2028,” Grand View Research, February 2021.

3 “The Epic Games Primer: Parts I-VI Directory” (May 22, 2020); “The Metaverse Primer” (June 29, 2021), MatthewBall.vc.

4 Artur Sychov reports having paid a creator US$2,000 for his avatar.
The dubious track record of ‘new’ nuclear technology

Will safety and economic problems that shut down similar reactors follow the industry to Wyoming?

Opinion
by Kerry Drake
November 30, 2021


A friend of mine says he has no problem with nuclear energy or “experimental” projects, but not when they are combined and linked to Wyoming.

I think it’s a sentiment many share. Residents know the state must find a replacement for fossil fuels to drive its economy, but offering the Equality State as the proving ground for new, “advanced” nuclear technology feels too risky.

Count me as a member of this group. That puts me at odds with advocates for the Natrium nuclear demonstration project — like state executives or Kemmerer officials anxious to save their town, which was chosen for the $4 billion facility. They say Wyoming needs to roll the dice and rake in the riches.

Such unfettered optimism is alluring. It’s easy to admire the vision of TerraPower founder Bill Gates: nuclear power plants replacing coal-fired facilities across the country and thus cutting planet-destroying carbon emissions. I understand why many want to jump on this bandwagon.

I don’t generally view myself as a “not-in-my-backyard” kind of guy. In this instance, though, I think it’s imperative to scrutinize the project at every stage.

There are myriad reasons for this cautious approach. Yet federal and state officials want to remove regulatory barriers and go full-speed ahead.

Safety heads the list. Yes, nuclear energy has been part of the nation’s electrical generation mix for decades, but Gates claims the Natrium sodium-cooled fast reactor, with molten salt-based energy storage, will produce less nuclear waste and be safer than a conventional light-water reactor.

The Union of Concerned Scientists, however, issued a report in March that sharply questions such assertions.

Sodium-cooled fast reactors, the report said, would likely be less “uranium-efficient” and not reduce the amount of waste that requires long-term isolation in a geologic repository. Sodium coolant can burn when exposed to air or water, and the organization said a demonstration project like the one proposed in Kemmerer “could experience uncontrollable power increases that result in rapid core melting.”

Gulp. If that scary scenario doesn’t warrant pumping the brakes in Wyoming, I don’t know what would.

But there are other important safety factors that also deserve attention. The Nuclear Regulatory Commission, under pressure to find carbon-free solutions to combat climate change, has been pushing to streamline safety and environmental reviews for advanced nuclear projects. The Wyoming Outdoor Council noted that the NRC is considering using a “generic” environmental impact statement, instead of detailed reviews of specific sites.

“We shouldn’t be cutting corners on health and safety to rush through an untested technology, and Wyoming’s harsh climate and seismic activity underscore the need for site-specific reviews,” WOC warned.

Wyoming officials, led by Gov. Mark Gordon, have enthusiastically endorsed the Natrium project, treating it purely as an economic windfall. But not only is there no guarantee such a boom will happen; other states’ and nations’ experiences strongly suggest it won’t.

In an open letter to Gates, Arnie Gundersen, a nuclear engineer who has spent 50 years working in the industry, called out the billionaire for leveraging his fortune “to siphon precious taxpayer funds supporting your latest atomic contrivance in Wyoming.”

TerraPower has a huge financial stake in the Kemmerer project, committing a total of $2 billion. It’s money that the corporation won’t recoup unless Natrium proves to be a reliable, safe alternative to fossil fuels that can be efficiently replicated at other facilities.


But without the backing of the federal government, Gates likely wouldn’t be in the game. That’s because nuclear reactors typically run way over budget, if they’re completed at all. The 345-megawatt Natrium project was originally announced as a $1 billion facility, but the price tag has quickly quadrupled.

President Joe Biden’s administration, anxious to meet international goals to halt climate change, is all-in on Natrium. The Department of Energy is matching Gates dollar-for-dollar to develop “new” sodium-cooled reactor technology that has failed since it was first introduced on the U.S. submarine Seawolf in the 1950s.

Other attempts include the privately financed Fermi 1 reactor near Lake Erie, which was shut down in 1966 after a partial meltdown, and the Clinch River project in Tennessee. The latter, a publicly funded project, was plagued by delays and huge cost overruns, and Congress finally shut it down in the mid-1980s due to serious safety concerns.

In his letter to Gates, Gundersen detailed the epic failure of the Monju sodium-cooled fast breeder reactor in Japan. It took a decade to build and was shut down in 1995, after just four months of operation, due to a sodium leak and fire. Monju wasn’t reopened until 2010, and was permanently halted a year later after a refueling accident.

The cost to the Japanese government? More than $11 billion. Meanwhile, after decades of expensive research, France’s nuclear agency has shelved plans to build a prototype sodium-cooled nuclear reactor.


“So now is the time to stop the Natrium marketing hype and free up those precious public funds to pursue low-cost and dependable renewable energy during the time frame necessary to help prevent catastrophic climate crises!” Gundersen concluded.

Many scientists agree. “The recent attention on nuclear energy is fully driven by the declining industry’s desperation for capital and its related lobby depicting it as a solution for climate change,” Jan Haverkamp of Greenpeace told Deutsche Welle, a German news organization. “… It does so too late and at a far too high cost.”

The Kemmerer project at the Naughton coal-fired plant, owned by Rocky Mountain Power, has a timeline that will be difficult to meet unless every phase goes off without a hitch — something that’s nearly unheard of in the nuclear industry. If the Nuclear Regulatory Commission approves permits, construction would begin in 2024, with the reactor in service only four years later.

PacifiCorp, Rocky Mountain Power’s parent company, estimates that it will take about 2,000 workers to build the plant in Kemmerer and 250 to operate it. It presents the ultimate lottery ticket for a small mining community whose future looked mighty bleak when PacifiCorp announced plans to close the coal-fired plant.

So, here’s today’s $4 billion question: What happens to Kemmerer and Wyoming if TerraPower’s critics are right and the plug is ultimately pulled for safety or economic reasons?

Wyoming will survive, but not without cost. It will have lost all the time and energy the state could have devoted to finding long-term solutions to diversify its economy and stop relying on fossil fuels.

And Kemmerer? The town has experienced so much uncertainty in recent years, I hope it would use any reprieve from the nuclear industry to its advantage. State government must step in and require TerraPower to pay enough for mitigation to ensure that Kemmerer won’t be left in the lurch.

If Gates is going to ride into town promising to be its economic savior, Wyoming needs to hold him to it. Don’t make us sic the posse on you, pardner.



KERRY DRAKE
Veteran Wyoming journalist Kerry Drake has covered Wyoming for more than four decades, previously as a reporter and editor for the Wyoming Tribune-Eagle and Casper Star-Tribune. He lives in Cheyenne.