Tuesday, August 22, 2023

Want to find UFOs? That's a job for machine learning

Cockpit video shows an anomalous aerial encounter in 2015. Credit: U.S. Navy video

In 2017, humanity got its first glimpse of an interstellar object (ISO), known as 1I/'Oumuamua, which buzzed our planet on its way out of the solar system. Speculation abounded as to what this object could be because, based on the limited data collected, it was clear that it was like nothing astronomers had ever seen. A controversial suggestion was that it might have been an extraterrestrial probe (or a piece of a derelict spacecraft) passing through our system.

Public fascination with the possibility of "alien visitors" was also bolstered in 2021 with the release of the UFO Report by the Office of the Director of National Intelligence (ODNI).

This move effectively made the study of unidentified aerial phenomena (UAP) a scientific pursuit rather than a clandestine affair overseen by government agencies. With one eye on the skies and the other on orbital objects, scientists are proposing how recent advances in computing, AI, and instrumentation can be used to assist in the detection of possible "visitors." This includes a recent study by a team from the University of Strathclyde that examines how hyperspectral imaging paired with machine learning can create an advanced data pipeline.

The team was led by Massimiliano Vasile, a professor of mechanical and aerospace engineering, and was composed of researchers from the schools of Mechanical and Aerospace Engineering and Electronic and Electrical Engineering at the University of Strathclyde and the Fraunhofer Center for Applied Photonics in Glasgow.

A preprint of their paper, titled "Space Object Identification and Classification from Hyperspectral Material Analysis," is available online via the pre-print server arXiv and is being reviewed for publication in Scientific Reports.

This study is the latest in a series that addresses applications for hyperspectral imaging for activities in space. The first paper, "Intelligent characterization of space objects with hyperspectral imaging," appeared in Acta Astronautica in February 2023 and was part of the Hyperspectral Imager for Space Surveillance and Tracking (HyperSST) project. This was one of 13 debris mitigation concepts selected by the UK Space Agency (UKSA) for funding last year and is the precursor to the ESA's Hyperspectral space debris Classification (HyperClass) project.

Their latest paper explored how this same imaging technique could be used in the growing field of UAP identification. The process consists of collecting and processing data from across the electromagnetic spectrum, even from single pixels, typically to identify different objects or materials captured in images. As Vasile explained to Universe Today via email, hyperspectral imaging paired with machine learning has the potential to narrow the search for possible technosignatures by eliminating false positives caused by human-made debris objects (spent stages, defunct satellites, etc.):

"If UAP are space objects, then what we can do by analyzing the spectra is to understand the material composition even from a single pixel. We can also understand the attitude motion by analyzing the time variation of the spectra. Both things are very important because we can identify objects by their spectral signature and understand their motion with minimal optical requirements."

Credit: Vasile, M. et al. (2023)

Vasile and his colleagues propose a data processing pipeline for UAP images built around machine learning algorithms. As a first step, they explain that the pipeline needs a dataset of time-series spectra of space objects, including satellites and other objects in orbit. This includes debris objects, which means incorporating data from NASA's Orbital Debris Program Office (ODPO), the ESA's Space Debris Office, and other national and international bodies. The dataset must be diverse, covering orbital scenarios, trajectories, and illumination conditions, with precise data on the geometry, material distribution, and attitude motion of all orbiting objects at all times.

In short, scientists would need a robust database of all human-made objects in space for comparison to eliminate false positives. Since much of this data is unavailable, Vasile and his team created numerical physics simulation software to produce training data for the machine learning models. The next step involved a two-pronged approach to associate a spectrum with the set of materials generating it: one prong based on machine learning and one based on a more traditional mathematical regression analysis used to determine the line of best fit for a set of data (aka the least-squares method).
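The least-squares prong of such an approach can be sketched in a few lines of numpy. The material spectra below are invented purely for illustration (the team's actual spectral library is not public), but the mechanics are the same: solve for the material abundances that best reproduce an observed single-pixel spectrum.

```python
import numpy as np

# Hypothetical reflectance spectra for three spacecraft materials over
# 50 wavelength bins. Real pipelines use measured library spectra; these
# are invented purely for illustration.
wavelengths = np.linspace(400, 1000, 50)                 # nm
aluminum = 0.9 - 0.0002 * (wavelengths - 400)
solar_cell = 0.1 + 0.0005 * (wavelengths - 400)
mli_foil = 0.5 + 0.1 * np.sin(wavelengths / 80)
library = np.column_stack([aluminum, solar_cell, mli_foil])

# Simulated single-pixel observation: 60% aluminum, 40% solar cell, noise.
true_mix = np.array([0.6, 0.4, 0.0])
rng = np.random.default_rng(0)
observed = library @ true_mix + rng.normal(0, 0.005, size=wavelengths.size)

# Least-squares estimate of the material abundances generating the spectrum.
est, *_ = np.linalg.lstsq(library, observed, rcond=None)
print(np.round(est, 2))
```

With low noise, the recovered abundances land close to the true 60/40 mixture, which is what lets a spectral signature flag an object as (or rule it out as) ordinary human-made hardware.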

They then used a classification system to associate the probability of detecting a combination of materials with a particular class. With the pipeline complete, said Vasile, the next step was to run a series of tests, which provided encouraging data:

"We ran three tests: one in a laboratory with a mockup of a satellite made of known materials. These tests were very positive. Then we created a high-fidelity simulator to simulate real observation of objects in orbit. Tests were positive and we learnt a lot. Finally, we used a telescope and we observed a number of satellites and the International Space Station. In this case, some tests were good, some less good, because our material database is currently rather small."

In their next paper, Vasile and his colleagues will detail the attitude reconstruction part of their pipeline, which they hope to present at the upcoming AIAA Science and Technology Forum and Exposition (2024 SciTech), held January 8th to 12th in Orlando, Florida.

More information: Massimiliano Vasile et al, Space Object Identification and Classification from Hyperspectral Material Analysis, arXiv (2023). DOI: 10.48550/arxiv.2308.07481

 

Floating seismometers could help peer into the core of Venus

Artist’s depiction of a series of balloons on Venus. Credit: NASA

Seismology has been ubiquitous on Earth for decades, and missions such as InSight have recently provided the same data for the inside of Mars. Understanding a planet's inner workings is key to understanding its geology and climate.

However, the inner workings of Venus, arguably our closest sister planet, have remained a mystery. The sulfuric acid clouds and scorching surface temperatures probably don't help. But Siddharth Krishnamoorthy from NASA's Jet Propulsion Laboratory and Daniel Bowman of Sandia National Laboratories think they have a solution—use seismometers hanging from balloons.

As reported previously, the idea has been around for a while. However, it might seem counter-intuitive—don't seismometers usually have to sit on the ground to detect something? Typical seismometers do, yes. However, another type of seismometer is only now becoming more accepted. An infrasound seismometer monitors infrasound pressure waves created by seismic activity and transmitted through a medium other than the ground—such as an atmosphere.

Venus has plenty of atmosphere to go around, so it sounds like an ideal place to use the technology. Even better, high up in its cloud layer is one of the places in the solar system most like the environment on Earth, which is why there have been plenty of ideas to build "cloud cities" on Venus. The work is published in the journal Geophysical Research Letters.

It's unnecessary to build an entire city simply to host some sensors collecting infrasound data, so a high-altitude balloon would serve the purpose nicely. That solves one of the most difficult challenges of exploring Venus—developing materials that can survive on its surface.

NASA has spent millions of dollars developing radiation-hardened sensors that can withstand the extraordinary pressure and temperatures on the surface. But even they are relatively simplistic, so a sensor held aloft at a reasonable temperature and pressure wouldn't require any additional development efforts for that specific use case.

Artist’s depiction of a balloon with sensor in Venus’ atmosphere. Credit: NASA / JPL-Caltech

This raises the obvious question—how will the signal get from the ground to sensors floating in the atmosphere? Earthquakes (or venusquakes, in this case) cause deafening sounds, which are transmitted through the atmosphere at low frequencies. Sensitive microphones, like the ones aloft in the balloon, could read these signals.

Such an experiment was recently carried out on Earth, where an infrasound microphone in the stratosphere picked up signals from two earthquakes of magnitude 7.3 and 7.5, despite being 3,000 km from their epicenters.
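As a rough illustration of how such a signal stands out, here is a toy numpy sketch (all frequencies and amplitudes are invented): a faint 0.1 Hz "quake" tone buried in pressure noise produces clear excess power in the infrasound band of the spectrum.

```python
import numpy as np

rng = np.random.default_rng(1)
fs = 20.0                      # sampling rate in Hz (hypothetical sensor)
t = np.arange(0, 600, 1 / fs)  # ten minutes of pressure readings

# Synthetic microbarometer record: broadband noise plus a faint 0.1 Hz
# infrasound arrival from a (synthetic) distant quake.
pressure = rng.normal(0, 1.0, t.size) + 0.3 * np.sin(2 * np.pi * 0.1 * t)

# Compare average spectral power inside vs. outside the infrasound band.
spectrum = np.abs(np.fft.rfft(pressure)) ** 2
freqs = np.fft.rfftfreq(t.size, 1 / fs)
band = (freqs > 0.05) & (freqs < 0.2)
in_band = spectrum[band].mean()
out_band = spectrum[~band & (freqs > 0)].mean()
print(in_band / out_band)  # noticeably above 1 when a quake signal is present
```

Real detections are far subtler, of course, but the principle is the same: seismic energy shows up as narrowband pressure power at frequencies well below audible sound.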

Using this experiment as a template, researchers could develop a similar system for use on Venus, with the changes necessary for that planet's environment. Notably, the experiment detected quakes from much farther away than previous infrasound seismometer tests, and much closer to the actual distance any such sensor would be from the epicenter of a venusquake.

There are still plenty of challenges, though. One is that no balloon mission has flown at Venus since the Soviet Vega probes in the 1980s, let alone one carrying sensitive seismometers. Second is that, in the case of the Earth experiment, there was a "ground truth": the researchers knew from other sources that an earthquake was happening when they received the signal.

Since there are no other sensors capable of providing that validation on Venus, researchers would likely have to speculate on what caused a particular pattern in the data—it could be a venusquake, or maybe the balloon was jostled in a certain way.

In addition, earthquakes with magnitudes above seven are considered large here on Earth, and it's unclear if the seismometers could pick up smaller quakes, even on our home planet. Venus might have the same range of seismic activity, or it could be even more active but with less intensity, making the detection of less powerful quakes a priority. The JPL team has picked up aftershocks as low as magnitude 4.2, though the balloon at that point was much closer than a few thousand kilometers away.

Using technologies developed on Earth in space exploration is always a good idea, and this seems to be a promising new way of putting these seismometers to work.

However, as of now, there are no plans for a mission utilizing these features, despite almost a dozen planned missions to Venus in the near future. For now, understanding the interior workings of our nearest sister planet will have to wait.

More information: Siddharth Krishnamoorthy et al, A "Floatilla" of Airborne Seismometers for Venus, Geophysical Research Letters (2022). DOI: 10.1029/2022GL100978


Scientists introduce the design and application prospect of China's Tiangong space station

by Beijing Institute of Technology Press Co., Ltd
Configuration diagram of Tiangong space station. Credit: Space: Science & Technology

China's manned space project was approved in 1992 with a three-step strategic plan, and building a space station is the final goal of this plan.

In September 2010, China's manned space station project was officially established. On April 29, 2021, the Tianhe core module was successfully launched into orbit by the carrier rocket Long March-5B at the Wenchang Spacecraft Launch Site in Hainan.

As planned, through 11 launches and on-orbit missions, China has completed the assembly and started the official operation of the space station. With the completion and operation of the Tiangong space station, China has become the third country able to build and operate space stations independently, after the Soviet Union and the United States.

In a research paper published in Space: Science & Technology, specialists from the Institute of Spacecraft System Engineering at the China Academy of Space Technology introduce the design of the Tiangong space station, summarize its technical characteristics, and discuss the station's future application and development.

First, the authors introduce the system scheme design of the Tiangong space station. Overall, the station has a T-shaped configuration, with the Tianhe core module in the middle and the two experiment modules, Wentian and Mengtian, assembled on either side.

Configuration diagram of Wentian experiment module. Credit: Space: Science & Technology
Configuration diagram of Tianhe core module. Credit: Space: Science & Technology

The Tianhe core module consists of three parts: a node cabin, a life control cabin, and a resource cabin. It has a total axial length of 16.6 m and a maximum diameter of 4.2 m. The Wentian experiment module consists of three parts: a working cabin, an airlock cabin, and a resource cabin. It has an axial length of 17.9 m and a maximum diameter of 4.2 m.

In addition, a small robotic arm, an experimental platform, and a truss structure are arranged outside the module. The Mengtian experiment module consists of four parts: a working cabin, a cargo airlock cabin, a load cabin, and a resource cabin. It has an axial length of 17.9 m and a maximum diameter of 4.2 m.

Two deployable platforms for exposed payloads are installed outside the load cabin, unfolding in orbit after launch. Moreover, the Tiangong space station has three docking hatches: forward (mainly for docking with manned spaceships and the Xuntian space telescope), backward (mainly for docking with cargo spaceships), and radial (mainly for docking with manned spaceships).

As for the system function design, the control and propulsion system includes control moment gyros, a reaction control system, a propellant refueling system, and the Hall electric propulsion system. The energy system is equipped with a large area of flexible solar arrays. The manned environment system possesses complete regenerative and non-regenerative life-support systems.

The station carries two seven-degree-of-freedom robotic arms, one large and one small, equipped with sensors for visual measurement, joint torque, and end-effector force. Extravehicular activities are supported by both the node cabin of the Tianhe core module and the airlock cabin of the Wentian experiment module. The crew is provided with an activity space of no less than 110 m³.

The authors then summarize the technical characteristics of the station, which are mainly reflected in four aspects.

The advanced design concept.

China adopts a unified design framework and technical system for the Tiangong space station, under which the GNC, energy, information, thermal control, and manned environment systems of all three modules are interconnected to work together. The space station also integrates intermodule resources efficiently, improving its overall capacity and system reliability.

The high proportion of new technology.


Space robotic arms, physicochemically regenerative life support, and the large-area flexible solar arrays of this space station are all brand-new technologies applied in orbit for the first time in China, with high application difficulty and development risk. Additionally, the Tiangong space station embodies a high degree of intelligence at both the equipment and system levels.
The excellent cost-effectiveness ratio.

In line with China's national conditions and practical needs in terms of building costs and application benefits, the Tiangong space station is designed to be of moderate scale relative to the ISS and Mir. Moreover, with more economical and rational construction and operation, it is expected to be superior in some aspects, such as power generation efficiency, application support capability, construction approach, and replenishment demand.
The safe and efficient residence.

The Tiangong space station is designed in line with a human-oriented concept. The design not only ensures the safety and comfort of the crew in orbit but also focuses on enabling the crew to make full use of their abilities.

More information: Xiang Wang et al, Design and Application Prospect of China's Tiangong Space Station, Space: Science & Technology (2023). DOI: 10.34133/space.0035

Provided by Beijing Institute of Technology Press Co., Ltd

Spanish astronomer discovers new active galaxy

Images of the newfound galaxy. Credit: Elio Quiroga Rodriguez (2023).

By analyzing images of the Sombrero Galaxy obtained with the Hubble Space Telescope (HST), Elio Quiroga Rodriguez of the Mid Atlantic University in Spain has identified a peculiar object, which turned out to be a galaxy hosting an active galactic nucleus (AGN). The finding was reported in a paper published August 11 on the pre-print server arXiv.

An AGN is a compact region at the center of a galaxy, more luminous than the surrounding galaxy light. Studies show that AGNs are very energetic due either to the presence of a black hole or star formation activity at the core of the galaxy.

Astronomers generally divide AGNs into two groups based on emission line features. Type 1 AGNs show broad and narrow emission lines, while only narrow emission lines are present in Type 2 AGNs. However, observations revealed that some AGNs transition between different spectral types; therefore, they were dubbed changing-look (CL) AGNs.

The Sombrero Galaxy (also known as Messier 104 or NGC 4594) is an unbarred spiral galaxy located on the border of the Virgo and Corvus constellations, some 31 million light years away. With a mass of about 800 billion solar masses, it is one of the most massive objects in the Virgo galaxy cluster. It also hosts a rich system of globular clusters.

Rodriguez recently investigated HST images of the Sombrero Galaxy, focusing on one particular object in its halo. He found that this object, previously classified as a globular cluster candidate, may be a barred spiral galaxy of the SBc type, with an AGN at its center.

"While studying HST images available on the HST Legacy website of the halo of M104 (HST proposal 9714, PI: Keith Noll), the author observed at 12:40:07.829-11:36:47.38 (in j2000) an object about four arcseconds in diameter. A study with VO tools suggests that the object is a SBc galaxy with AGN (Seyfert)," the paper reads.

The object is cataloged in the Pan-STARRS1 data archive as PSO J190.0326-11.6132. By analyzing RGB data from the Aladin Sky Atlas, Rodriguez found that PSO J190.0326-11.6132 is a galaxy with a dominant central arm, a nucleus, and possibly two spiral arms with hot young stars and dust. The astronomer proposes that the newfound galaxy be named the "Iris Galaxy."

The study found that PSO J190.0326-11.6132 has a radial velocity of about 1,359 km/s. Rodriguez notes that the object, if gravitationally bound to the Sombrero Galaxy, could be its satellite, with a size of around 1,000 light years.

However, the author notes that if the Iris Galaxy is not associated with the Sombrero Galaxy, its distance may be some 65 million light years. In that scenario, the size of the newly detected galaxy would be about 71,000 light years.

The X-ray luminosity of the Iris Galaxy was measured to be approximately 18 tredecillion erg/s, assuming a distance of 65 million light years. Such luminosity indicates the presence of an AGN; however, further observations are required to determine whether it is a Type 1 or Type 2 AGN.
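The quoted luminosity and the assumed distance are tied to the flux measured at Earth by the inverse-square law, L = 4πd²F. A quick sketch (using tredecillion = 10⁴² in the short scale; the flux printed below is derived from the other two numbers, not taken from the paper):

```python
import math

LY_CM = 9.461e17            # centimeters per light year
distance = 65e6 * LY_CM     # assumed distance: 65 million light years, in cm
luminosity = 18e42          # erg/s ("18 tredecillion", short scale)

# Flux that would be measured at Earth under the inverse-square law.
flux = luminosity / (4 * math.pi * distance**2)
print(f"{flux:.1e} erg/s/cm^2")
```

This is why the luminosity estimate shifts so much if the galaxy turns out to be a nearby satellite of the Sombrero Galaxy instead of a background object: the same measured flux implies a luminosity smaller by the square of the distance ratio.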

More information: E. Quiroga, A peculiar galaxy near M104, arXiv (2023). DOI: 10.48550/arxiv.2308.06187




A new way to measure the expansion rate of the universe: Redshift drift

Cosmological redshift depends upon a galaxy's distance. Credit: NASA/JPL-Caltech/R. Hurt (Caltech-IPAC)

In 1929 Edwin Hubble published the first solid evidence that the universe is expanding. Drawing upon data from Vesto Slipher and Henrietta Leavitt, Hubble demonstrated a correlation between galactic distance and redshift. The more distant a galaxy was, the more its light appeared shifted to the red end of the spectrum.

We now know this is due to cosmic expansion. Space itself is expanding, which makes distant galaxies appear to recede from us. The rate of this expansion is known as the Hubble parameter, and while we have a good idea of its value, there is still a bit of tension between different results.

One of the difficulties in resolving this tension is that, so far, we can only measure cosmic expansion as it appears right now. This also means we can't determine whether cosmic expansion is due to a cosmological constant or a more subtle extension of Einstein's model. But as powerful new telescopes are built, we might be able to observe the evolution of cosmic expansion thanks to what is known as the redshift drift effect.

The Hubble parameter has a value of about 70 km/s per megaparsec. This means if a galaxy is about 1 megaparsec away (about 3 million light-years), then the galaxy appears to be moving away from us at about 70 km/s. If a galaxy is 2 megaparsecs away, it will appear to recede at about 140 km/s. The greater a galaxy's distance, the greater its apparent speed.
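The numbers in this paragraph follow directly from the linear Hubble relation, v = H0 × d; a quick sketch:

```python
H0 = 70.0  # Hubble parameter, km/s per megaparsec

def recession_velocity(distance_mpc):
    """Apparent recession velocity (km/s) from the linear Hubble law v = H0 * d."""
    return H0 * distance_mpc

print(recession_velocity(1))  # 70.0 km/s
print(recession_velocity(2))  # 140.0 km/s
```

(The linear relation only holds at modest distances; for very distant galaxies the full relativistic treatment takes over.)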

Since the universe is still expanding, with each passing year a galaxy is a bit more distant, and that means its redshift should become slightly larger. In other words, cosmic expansion means that the redshifts of galaxies should drift more to the red over time.

Theoretical redshift drift based on the standard model. Credit: ESO / ELT Science Case

This drift is extremely small. For a galaxy 12 billion light-years away, its apparent speed would be about 95% of the speed of light, while its drift would be just 15 cm/s each year. That's much too small for current telescopes to observe. But when the Extremely Large Telescope (ELT) starts gathering data in 2027, it should be able to observe this drift over time. Estimates are that after 5–10 years of precise observations, the ELT should be able to see redshift drifts on the order of 5 cm/s.

While this will become a powerful tool in our understanding of the universe, it will take a lot of data and a lot of time. So a new paper, published on the preprint server arXiv, proposes a different method using gravitational lensing.

The authors call this effect redshift difference. Rather than observing the redshift of a galaxy over decades, the team proposes looking for distant galaxies that are gravitationally lensed by a closer galaxy. Lots of distant galaxies are lensed by a closer galaxy between us and the distant one, but most lensed galaxies appear as a single distorted arc to the side of the foreground galaxy.

How gravitational lensing can create multiple galaxy images. Credit: NASA/CXC/M.Weiss

But sometimes gravitational lensing can create multiple images of a distant galaxy. Since each image of the distant galaxy takes a slightly different path to reach us, the distance of each path is also slightly different. So instead of waiting decades for a galaxy to move farther away from us, we can get snapshots of the galaxy separated by years or decades. Each image would have a slightly different redshift, and by comparing these we could measure the redshift drift.
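To get a feel for the size of the effect, here is a back-of-the-envelope sketch using the roughly 15 cm/s-per-year drift quoted earlier for a very distant source and a hypothetical 20-year path difference between lensed images; both numbers are illustrative, not from the paper.

```python
C = 299_792_458.0       # speed of light, m/s
drift_rate = 0.15       # m/s of extra apparent speed per year (illustrative)
time_delay_years = 20   # hypothetical difference in light-travel time

# The two lensed images show the source as it was at moments separated by
# the lens time delay, so their redshifts differ by the drift accumulated
# over that interval.
delta_v = drift_rate * time_delay_years  # 3.0 m/s velocity difference
delta_z = delta_v / C                    # corresponding redshift difference
print(f"{delta_z:.1e}")
```

A redshift difference of order 10⁻⁸ is why the authors frame this as a target for future ultra-stable spectrographs rather than current instruments.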

This is still beyond our current ability to detect. But while we are waiting for telescopes such as the ELT to come online, we can search for distant lensed galaxies with multiple images. That way, when we do have the ability to detect redshift drift, we won't have to wait decades for the result.

More information: Chengyi Wang et al, The Redshift Difference in Gravitational Lensed Systems: A Novel Probe of Cosmology, arXiv (2023). DOI: 10.48550/arxiv.2308.07529

Fulvio Melia, Definitive test of the Rh = ct universe using redshift drift, Monthly Notices of the Royal Astronomical Society: Letters (2016). DOI: 10.1093/mnrasl/slw157


Webb reveals intricate details in the remains of a dying star

New images from NASA's James Webb Space Telescope of the well-known Ring Nebula provide unprecedented spatial resolution and spectral sensitivity. In the NIRCam (Near-Infrared Camera) image on the left, the intricate details of the filament structure of the inner ring are particularly visible. On the right, the MIRI (Mid-Infrared Instrument) image reveals particular details in the concentric features in the outer regions of the nebula's ring. Credit: ESA/Webb, NASA, CSA, M. Barlow (University College London), N. Cox (ACRI-ST), R. Wesson (Cardiff University).

NASA's James Webb Space Telescope obtained images of the Ring Nebula, one of the best-known examples of a planetary nebula. Much like the Southern Ring Nebula, one of Webb's first images, the Ring Nebula displays intricate structures of the final stages of a dying star. Roger Wesson from Cardiff University tells us more about this phase of a sun-like star's stellar lifecycle and how Webb observations have given him and his colleagues valuable insights into the formation and evolution of these objects, hinting at a key role for binary companions.

"Planetary nebulae were once thought to be simple, round objects with a single dying star at the center. They were named for their fuzzy, planet-like appearance through small telescopes. Only a few thousand years ago, that star was still a red giant that was shedding most of its mass."

"As a last farewell, the hot core now ionizes, or heats up, this expelled gas, and the nebula responds with colorful emission of light. Modern observations, though, show that most planetary nebulae display breathtaking complexity. It begs the question: how does a spherical star create such intricate and delicate non-spherical structures?"

"The Ring Nebula is an ideal target to unravel some of the mysteries of planetary nebulae. It is nearby, approximately 2,200 light-years away, and bright—visible with binoculars on a clear summer evening from the northern hemisphere and much of the southern. Our team, named the ESSENcE (Evolved StarS and their Nebulae in the JWST Era) team, is an international group of experts on planetary nebulae and related objects."

"We realized that Webb observations would provide us with invaluable insights, since the Ring Nebula fits nicely in the field of view of Webb's NIRCam (Near-Infrared Camera) and MIRI (Mid-Infrared Instrument) instruments, allowing us to study it in unprecedented spatial detail. Our proposal to observe it was accepted (General Observers program 1558), and Webb captured images of the Ring Nebula just a few weeks after science operations started on July 12, 2022."

"When we first saw the images, we were stunned by the amount of detail in them. The bright ring that gives the nebula its name is composed of about 20,000 individual clumps of dense molecular hydrogen gas, each of them about as massive as the Earth. Within the ring, there is a narrow band of emission from polycyclic aromatic hydrocarbons, or PAHs—complex carbon-bearing molecules that we would not expect to form in the Ring Nebula."

"Outside the bright ring, we see curious 'spikes' pointing directly away from the central star, which are prominent in the infrared but were only very faintly visible in Hubble Space Telescope images. We think these could be due to molecules that can form in the shadows of the densest parts of the ring, where they are shielded from the direct, intense radiation from the hot central star."

"Our MIRI images provided us with the sharpest and clearest view yet of the faint molecular halo outside the bright ring. A surprising revelation was the presence of up to ten regularly spaced, concentric features within this faint halo. These arcs must have formed about every 280 years as the central star was shedding its outer layers. When a single star evolves into a planetary nebula, there is no process that we know of that has that kind of time period."

"Instead, these rings suggest that there must be a companion star in the system, orbiting about as far away from the central star as Pluto does from our sun. As the dying star was throwing off its atmosphere, the companion shaped and sculpted the outflow. No previous telescope had the sensitivity and the spatial resolution to uncover this subtle effect."

"So how did a spherical star form such a structured and complicated nebula as the Ring Nebula? A little help from a binary companion may well be part of the answer."

 

Teamwork is not always the best way of working, shows new study

Credit: Girts Ragelis/Shutterstock

Throughout the 21st century, teamwork has come to define the modern work environment. Driven by advances in communication technology, working collaboratively is, as management experts will tell you, how you harness the "collective intelligence."

Collective intelligence is often seen as greater than the sum of its parts: superior to the cumulative individual intelligence of the group's members. Capitalizing on it is said to improve task accuracy (finding better and more correct answers), and enhance task efficiency (finding good answers faster). This in turn leads to quicker and higher quality completion. In other words, when we work together, our performance improves. This has been one of the major factors shaping our modern societies.

At the same time, though, both research and popular idiom underline the limits inherent to the concept. If "two heads are better than one" suggests the benefits of collaboration, "too many cooks spoil the broth" suggests the opposite.

I led a recent study looking at whether training and team composition might affect how efficient people are when working together. We found that the benefits of collective intelligence can be outweighed by the cost of having to coordinate between team members.

The dynamics of teamwork

We designed an experimental study using an existing online citizen science project, Wildcam Gorongosa. Participants analyze webcam photos taken in Gorongosa National Park, Mozambique, to find and identify animals and their behavior.

We invited 195 members of the public to our lab in Oxford to participate. The experiment comprised two stages: training, then testing, which they did first on their own and then in teams of two. They had five subtasks to complete: detecting the presence of animals; counting how many there were; identifying what they were doing (standing, resting, moving, eating or interacting); specifying whether any young were present; and identifying the animals from 52 possible species (the option of "nothing here" was included, but not "I don't know").

We split the participants into two groups. One received targeted training with images similar to the test set. The other received general training with a diverse range of images.

We found the type of training did indeed affect their performance. For those with general training—the "generalists"—efficiency initially improved, but then declined, once they were tested on the specific set of test images. By contrast, those with targeted training—the "experts"—consistently maintained or improved their performance.

How performance changed during the training and testing stages:

The average change in efficiency tracks the number of correct classifications per minute. Credit: Taha Yasseri, CC BY-NC-ND

To investigate the impact team dynamics would have, we then formed three types of group: these featured either two experts, two generalists, or a mixed pair.

Surprisingly, we found that neither two generalists nor a mixed group performed better than a single generalist working alone. Even two experts working together did not do better than a single expert.

How the groups' composition affected their efficiency:

Efficiency varied over time depending on whether the work was carried out by mixed groups, groups of experts, or single experts. Credit: Taha Yasseri, CC BY-NC-ND

We also found that while having an expert in a group improved accuracy for the more complex tasks, it did not improve the group's efficiency. In other words, the team got more correct answers but took considerably longer to do so. And for simple tasks, there was no improvement in accuracy from having an expert. Ultimately, the time that team members lost in coordinating with each other outweighed the benefit of adding an expert to the group.

What can we say about the future of work?

Research has long shown that underperformance in a group is often due to what psychologists term "process losses". The performance of a team can, for example, be adversely affected by what cognitive scientists call "herding" effects, because these can lead to collective decisions being disproportionately influenced by a few members of the group who are less competent yet more confident.

Further, psychologists speak about "social loafing" to describe a person performing poorly because they are part of a group—they have the impression that others will do the job without them needing to contribute. When a large number of team members follow this strategy, it can result in the combined efforts of the team being even lower than the sum of individual efforts.

Research also shows the importance of social learning for effective collaborative working, something our study highlights. Our experimental method involved individual training sessions followed immediately by team testing, which precluded opportunities for people to learn by observing their coworkers' performance, and therefore eliminated one of the advantages of being part of a group during the learning process.

The context in which teamwork and collaboration take place matters, as do the tools available for coordination between team members. As internet-based communication technologies are used not only for large-scale voluntary collaborative endeavors, such as citizen science projects, but also for remote working, it is important to recognize the potential effects of different collaborative approaches and team dynamics.

When workers don't have the chance to observe other workers and reap the advantages of social learning, and when communication is less efficient than face-to-face interaction, the costs and benefits in the teamwork equation can shift. Our research shows that this is even more pronounced for simpler tasks that don't require extensive creative problem-solving. Opting to work individually could indeed be a more viable approach.

The dynamics of teamwork—whether in the workplace or in the context of collective action—are complex. While collaboration offers benefits in specific contexts, it is essential to consider the trade-offs between time, accuracy and efficiency. Coordination comes at a cost.

This article is republished from The Conversation under a Creative Commons license. Read the original article.


Predicting financial crises in e-commerce

Credit: Pixabay/CC0 Public Domain

A study in the International Journal of Computational Systems Engineering has investigated the e-commerce landscape and how it is affected by financial crises. The insights from the study offer a financial accounting crisis early warning system that companies might use to predict and pre-empt economic turmoil.

The global pandemic underscored the vulnerability of businesses and economies, making the need for astute financial foresight more crucial than ever. Xiaoyang Meng of the Accounting Institute at Jiaozuo University in Jiaozuo, China, has looked specifically at the impact on China and has devised a novel system that melds adaptability and prediction.

The approach uses partial least squares (PLS) analysis, a sophisticated data analysis technique, and integrates it with a backpropagation (BP) neural network. The model can then discern indicators of an impending financial crisis within the e-commerce sector. Meng has demonstrated the model's proficiency on data for 11 financially sound enterprises and nine that were teetering on the brink of financial crisis, showing that the model could reveal the early signs of financial distress with an accuracy surpassing 90%, and for some tests an accuracy of 98%.

The implications of this research may well be far-reaching. In an era where economic turbulence threatens the stability of even the most robust business, Meng's PLS-BP model offers a grounded means to identify an imminent crisis and so put in place strategies that might avert it.

Meng acknowledges that the model as it stands has some limitations. While the early detection methodology offers good levels of precision, it is essentially a static approach. To better navigate real-world financial ecosystems, she proposes integrating the model with system dynamics theory. This could potentially offer a dynamic model capable of adapting to the ever-evolving intricacies of e-commerce.

More information: Xiaoyang Meng, Research on e-commerce neural network financial accounting crisis early warning model combined with partial least squares, International Journal of Computational Systems Engineering (2023). DOI: 10.1504/IJCSYSE.2023.132913

Provided by Inderscience 
