It’s possible that I shall make an ass of myself. But in that case one can always get out of it with a little dialectic. I have, of course, so worded my proposition as to be right either way (K.Marx, Letter to F.Engels on the Indian Mutiny)
Wednesday, May 28, 2025
Bed bugs are most likely the first human pest, new research shows
Credit: Photo by Felicia Spencer for Virginia Tech.
Ever since a few enterprising bed bugs hopped off a bat and attached themselves to a Neanderthal walking out of a cave 60,000 years ago, bed bugs have enjoyed a thriving relationship with their human hosts.
Not so for the unadventurous bed bugs that stayed with the bats — their populations have continued to decline since the Last Glacial Maximum, also known as the ice age, which was about 20,000 years ago.
A team led by two Virginia Tech researchers recently compared the whole-genome sequences of these two genetically distinct lineages of bed bugs. Published in Biology Letters on Tuesday, May 28, their findings indicate that the human-associated lineage followed a demographic pattern similar to that of humans and may well be the first true urban pest.
“We wanted to look at changes in effective population size, which is the number of breeding individuals that are contributing to the next generation, because that can tell you what’s been happening in their past,” said Lindsay Miles, lead author and postdoctoral fellow in the Department of Entomology.
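The concept can be made concrete with a small numeric sketch (illustrative only, not from the study): over fluctuating generations, effective population size is approximated by the harmonic mean of the per-generation sizes, which is why a single bottleneck, like a small founding group leaving the caves, depresses it so strongly.

```python
# Illustrative sketch: effective population size (Ne) across fluctuating
# generations is approximated by the harmonic mean of per-generation
# census sizes -- one crash drags Ne down far more than one boom lifts it.
def harmonic_mean_ne(sizes):
    return len(sizes) / sum(1.0 / n for n in sizes)

stable = [10_000] * 5
bottleneck = [10_000, 10_000, 100, 10_000, 10_000]  # one crash generation

print(round(harmonic_mean_ne(stable)))      # 10000
print(round(harmonic_mean_ne(bottleneck)))  # 481 -- far below the average census size
```

A single generation at 100 individuals pulls the long-run Ne to under 500, even though the arithmetic mean census size is above 8,000.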
According to the researchers, the historical and evolutionary symbiotic relationship between humans and bed bugs will inform models that predict the spread of pests and diseases under urban population expansion.
By directly tying human global expansion to the emergence and evolution of urban pests like bed bugs, researchers may identify the traits that co-evolved in both humans and pests during urban expansion.
A stairway graph (at left) shows that the genome-wide patterns of bed bug demography mirror global human expansion, courtesy of Biology Letters 21: 20250061. The image of bed bugs is courtesy of Warren Booth.
“Initially with both populations, we saw a general decline that is consistent with the Last Glacial Maximum; the bat-associated lineage never bounced back, and it is still decreasing in size,” said Miles, an affiliate with the Fralin Life Sciences Institute. “The really exciting part is that the human-associated lineage did recover and their effective population increased.”
Miles points to the early establishment of large human settlements, which expanded into cities in regions such as Mesopotamia about 12,000 years ago.
“That makes sense because modern humans moved out of caves about 60,000 years ago,” said Warren Booth, the Joseph R. and Mary W. Wilson Urban Entomology Associate Professor. “There were bed bugs living in the caves with these humans, and when they moved out they took a subset of the population with them so there’s less genetic diversity in that human-associated lineage.”
As human populations grew and communities expanded into cities, the human-associated lineage of bed bugs saw exponential growth in its effective population size.
Using the whole-genome data, the researchers now have a foundation for further study of this 245,000-year-old lineage split. Because the two lineages are genetically distinct yet have not diverged enough to form two separate species, the researchers are interested in the more recent evolutionary changes in the human-associated lineage compared with the bat-associated lineage.
“What will be interesting is to look at what’s happening in the last 100 to 120 years,” said Booth. “Bed bugs were pretty common in the old world, but once DDT [dichloro-diphenyl-trichloroethane] was introduced for pest control, populations crashed. They were thought to have been essentially eradicated, but within five years they started reappearing and were resisting the pesticide.”
Booth, Miles, and graduate student Camille Block have already discovered a gene mutation that could contribute to that insecticide resistance in a previous study, and they are looking further into the genomic evolution of the bed bugs and relevance to the pest’s insecticide resistance.
Booth said the project is a good example of what happens when researchers “follow the science,” which he is afforded the space to do thanks in part to the Joseph R. and Mary W. Wilson endowment that supports his faculty position.
“It’s a great resource to have,” said Booth. “We are using it for work investigating the evolution of insecticide resistance and species spread using museum specimens collected from 120 years ago to our present-day samples. I’m very lucky to have that freedom to explore.”
Columns 1 and 2 present the results using COD1 to measure firms’ borrowing costs, whereas Columns 3 and 4 show the results using COD2. As reported, the coefficient estimates on Broadband in Columns 1 and 2 are positive and significant at the 1% level, indicating that digital infrastructure increases corporate borrowing costs. The estimates remain positive and significant when COD2 is used to measure borrowing costs. The magnitudes are also economically sizable: digital infrastructure results in a 7.8% (9.17%) increase in firms’ borrowing costs as measured by COD1 (COD2). Accordingly, our baseline result implies that the positive relation between digital infrastructure and firms’ borrowing costs is both statistically and economically significant.
Credit: Yan Jiang (Shanghai University of Finance and Economics, China), Dayong Lv (Shanghai Lixin University of Accounting and Finance, China), Suyu Hao (Tongji University, China), Xiaokun Wei (Tongji University, China), and Youyi Wu (The University of Chicago, Chicago, USA)
Background and Motivation
As broadband internet becomes essential to socioeconomic development, digital technology is altering how companies operate and finance themselves. However, existing research mainly focuses on the impact of digital infrastructure on macroeconomic performance or corporate innovation, with limited attention paid to its direct effect on debt financing costs. The nature of China’s financial market, in which firms rely heavily on bank loans, makes this issue especially relevant. Drawing on China’s “Broadband China” policy, this study provides the first systematic analysis of how digital infrastructure affects corporate debt costs, filling key theoretical and empirical gaps.
Methodology and Scope
The study adopts a Difference-in-Differences (DID) approach, leveraging the quasi-natural experiment created by the phased implementation of the “Broadband China” policy from 2014 to 2016 across 120 pilot cities. This exogenous policy shock allows the identification of the causal impact of digital infrastructure on debt costs.
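The identification logic can be illustrated with a minimal, self-contained sketch on synthetic data; the effect size, cell sizes, and noise level below are invented for illustration and are not the paper's data or estimates.

```python
# Minimal difference-in-differences sketch on synthetic data.
# All numbers (5% baseline cost, +0.5pp policy effect, noise) are
# invented assumptions, not the study's actual figures.
import numpy as np

rng = np.random.default_rng(42)
n = 500  # firm-year observations per cell

# Baseline borrowing cost ~5%, a common time trend, and a +0.5pp
# effect that hits only treated (pilot-city) firms after the policy.
pre_ctrl   = 0.050 + rng.normal(0, 0.005, n)
post_ctrl  = 0.052 + rng.normal(0, 0.005, n)
pre_treat  = 0.050 + rng.normal(0, 0.005, n)
post_treat = 0.052 + 0.005 + rng.normal(0, 0.005, n)

# DID estimate: change in treated cities minus change in control cities.
# The common time trend (0.052 - 0.050) cancels out of the difference.
did = (post_treat.mean() - pre_treat.mean()) - (post_ctrl.mean() - pre_ctrl.mean())
print(round(did, 4))  # close to the simulated 0.005 effect
```

The subtraction of the control group's change is what removes the common time trend, which is the core of the DID identification the study relies on; a full implementation would add firm and year fixed effects.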
Key Findings and Contributions
Theoretical Advancement: This study is the first to reveal that digital infrastructure can increase corporate debt costs by intensifying market competition, challenging the traditional view that enhanced information transparency reduces financing costs.
Policy Insights: It finds that non-state-owned and smaller enterprises are more significantly affected by the competitive pressures, offering a basis for differentiated regulatory responses.
Methodological Innovation: This study integrates macro-level policy shocks with micro-level corporate data to demonstrate the impact of digital infrastructure on financing costs via intensified market competition.
Why It Matters
The findings hold dual significance for businesses and policymakers:
For Firms: Firms need to be aware of the potential hidden financing risks associated with digital infrastructure and adapt by streamlining their debt structures and strengthening their competitive edge.
For Policymakers: The rollout of digital infrastructure should be coordinated with anti-monopoly measures to prevent excessive competition from threatening the survival of SMEs.
The full text is available for download as a PDF and is intended to inform and inspire a wide range of readers.
Researchers at the University of Eastern Finland conducted a systematic review of 99 scientific publications that compared the flora or fauna of old-growth forests, managed forests and clearcut sites in boreal Europe. The reviewed studies showed large differences in the species communities inhabiting these forest types.
The studies analysed for the systematic review show that the species richness of full-canopy forests increases as the forest gets older. Clearcut sites are also species-rich, but they are inhabited by a distinct set of species in comparison to full-canopy forests.
The studies had examined several species groups that cover a large part of the flora and fauna inhabiting forests, including birds, epiphytes (i.e., lichens and bryophytes growing on tree trunks), fungi and insects inhabiting deadwood, soil fungi, ground-dwelling invertebrates, other insects and understory vegetation.
Birds, epiphytes, and deadwood-inhabiting fungi and insects were more diverse in older forests. Understory vegetation and insects other than those inhabiting deadwood were more diverse in younger forests, especially clearcut sites. There were few studies on soil fungi.
The flora and fauna characteristic of old-growth forests are associated with old or dead trees. Such species are rare or absent in forests in commercial use.
“In managed forests, trees are cut down before they grow old or die, and thus old and dead trees are scanty in them. Leaving a portion of the trees standing during clearcutting would be an effective solution, but only very few of these so-called retention trees are left in harvests,” says Doctoral Researcher Aleksi Nirhamo from the University of Eastern Finland.
Clearcut sites are inhabited by species associated with open habitats and young forests. Conditions on clearcut sites are very different from those in full-canopy forests, and thus they are inhabited largely by distinct sets of species.
Even though the flora and fauna of clearcut sites are rich in certain species groups, they are degraded habitats.
“Without clearcuts, early stages of forest succession would be formed by disturbances such as storms. These kinds of habitats differ from clearcut sites, for example by having huge amounts of deadwood, which clearcut sites lack,” Nirhamo points out.
“Additionally, the treeless stage of succession is cut short by the planting of conifers, which also homogenises the tree species composition of the regenerating stand,” he continues.
At the moment, there is a great abundance of habitat in the Nordic countries for species inhabiting clearcut sites. For the species of old-growth forests, the situation is worse.
“Forest habitats are modified widely in order to enhance wood production, which homogenises and degrades them,” Nirhamo says.
“Especially old and dead trees are abundant and versatile only in forests that have remained unharvested for long, and thus there is little space for species dependent on those resources.”
Laser-based processes for metals are considered to be particularly versatile in industry. Lasers can be used, for example, to precision-weld components together or produce more complex parts using 3D printing – quickly, precisely and automatically. This is why laser processes are used in numerous sectors, such as the automotive and aviation industries, where maximum precision is required, or in medical technology, for example for the production of customized titanium implants.
However, despite their efficiency, laser processes are technically challenging. The complex interactions between the laser and the material make the process sensitive to the smallest of deviations – whether in the material properties or in the settings of the laser parameters. Even minor fluctuations can lead to errors in production.
“To ensure that laser-based processes can be used flexibly and achieve consistent results, we are working on better understanding, monitoring and control of these processes,” says Elia Iseli, research group leader in Empa's Advanced Materials Processing laboratory in Thun. In line with these principles, Giulio Masinelli and Chang Rajani, two researchers from his team, want to make laser-based manufacturing techniques more affordable, more efficient and more accessible – using machine learning.
Vaporize or melt?
First, the two researchers focused on additive manufacturing, i.e. the 3D printing of metals using lasers. This process, known as powder bed fusion (PBF), works slightly differently from conventional 3D printing: thin layers of metal powder are melted by the laser in exactly the right spots, so that the final component is gradually “welded” out of them.
PBF allows the creation of complex geometries that are hardly possible with other processes. Before production can begin, however, a complex series of preliminary tests is almost always required. This is because there are basically two modes for laser processing of metal, including PBF: In conduction mode, the metal is simply melted. In keyhole mode, it is even vaporized in some instances. The slower conduction mode is ideal for thin and very precise components. Keyhole mode is slightly less precise, but much faster and suitable for thicker workpieces.
Where exactly the boundary between these two modes lies depends on a variety of parameters. The right settings are needed for the best quality of the final product – and these vary greatly depending on the material being processed. “Even a new batch of the same starting powder can require completely different settings,” says Masinelli.
Better quality with fewer experiments
Normally, a series of experiments must be carried out before each batch to determine the optimum settings for parameters such as scanning speed and laser power for the respective component. This requires a lot of material and must be supervised by an expert. “That is why many companies cannot afford PBF in the first place,” says Masinelli.
Masinelli and Rajani have now optimized these experiments using machine learning and data from optical sensors that are already built into the laser machines. Using this optical data, the researchers “taught” their algorithm to recognize which welding mode the laser is in during a test run. Based on this, the algorithm determines the settings for the next test. This reduces the number of preliminary experiments required by around two thirds, while maintaining the quality of the product.
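A highly simplified sketch of the underlying idea: let a classifier-labeled test weld tell you which mode you are in, then search the parameter range efficiently instead of sweeping it. The `run_test_weld` function and the 300 W threshold below are hypothetical stand-ins, not the Empa system.

```python
# Hedged sketch: a (pretrained) classifier on optical sensor data labels
# each test weld as conduction or keyhole; bisecting on laser power then
# finds the mode boundary in far fewer experiments than a full sweep.
# `run_test_weld` and the hidden 300 W threshold are invented stand-ins.

def run_test_weld(power_w):
    """Stand-in for one experiment: reports 'keyhole' above a hidden
    threshold (here 300 W), as a sensor-based classifier would."""
    return "keyhole" if power_w > 300 else "conduction"

def find_mode_boundary(lo=100.0, hi=600.0, tol=5.0):
    # Classic bisection: each experiment halves the search interval,
    # so the number of test welds grows only logarithmically.
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if run_test_weld(mid) == "keyhole":
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2

print(round(find_mode_boundary()))  # converges near the hidden 300 W threshold
```

Seven test welds narrow a 500 W range down to a 5 W window, which is the kind of reduction in preliminary experiments the article describes, though the actual algorithm is more sophisticated than plain bisection.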
“We hope that our algorithm will enable non-experts to use PBF devices,” summarizes Masinelli. All it would take for the algorithm to be used in industry is integration into the firmware of the laser welding machines by the device manufacturers.
Real-time optimization
PBF is not the only laser process that can be optimized using machine learning. In another project, Rajani and Masinelli focused on laser welding – but went one step further. They not only optimized the preliminary experiments, but also the welding process itself. Even with the ideal settings, laser welding can be unpredictable, for example if the laser beam hits tiny defects on the surface of the metal.
“It is currently not possible to influence the welding process in real time,” says Chang Rajani. “This is beyond the capabilities of human experts.” The speed at which the data have to be evaluated and decisions made is a challenge even for computers. This is why Rajani and Masinelli used a special type of computer chip for this task, a so-called field-programmable gate array (FPGA). “With FPGAs, we know exactly when they will execute a command and how long the execution will take – which is not the case with a conventional PC,” explains Masinelli.
Nevertheless, the FPGA in their system is also linked to a PC, which serves as a kind of “backup brain”. While the specialized chip is busy observing and controlling the laser parameters, the algorithm on the PC learns from this data. “If we are satisfied with the performance of the algorithm in the virtual environment on the PC, we can 'transfer' it to the FPGA and make the chip more intelligent all at once,” explains Masinelli.
The two Empa researchers are convinced that machine learning and artificial intelligence can contribute a great deal more in the field of laser processing of metals. That is why they are continuing to develop their algorithms and models and are expanding their area of application – in collaboration with partners from research and industry.
Empa researchers Giulio Masinelli (left) and Chang Rajani want to make laser-based processes for metals more accessible.
Before and after: Above a laser weld seam before “learning”, below a weld seam from the process optimized by the algorithm.
A systematic literature review was performed on the use of GIS technologies in managing public built assets. High-level trend analysis was complemented with in-depth study of 62 research works divided into 11 asset classes. An NLP-based tool is also presented to address some of the shortcomings of keyword co-occurrence analysis. The tool analyzes keywords by research topic and presents potential research gaps for each topic, and it can handle a larger number of keyword terms without producing cluttered graphs. Future research opportunities exist in water resources, risk assessment, physical environment, growth forecasting, and imaging and remote sensing. There is also a need for improved visualization and for unified software solutions that integrate data sources such as GIS and BIM. More studies on specific asset types are needed to evaluate impact and address implementation challenges, and technology readiness must improve to facilitate adoption among practitioners in industry and government.
Researchers have performed an in-depth literature review on the application of GIS technologies to public asset management. The review findings are categorized by asset type, and research trend analysis reveals many useful applications and promising gaps for future research. Significant improvements can be made in the field, including better data integration, more risk assessment using GIS technologies, and improved decision support systems that move decision making from reactive to proactive. Furthermore, more research can be done on specific individual asset types. The researchers also present a software tool, based on natural language processing, that improves keyword-based gap analysis. The tool clusters the keyword research terms from a body of literature by theme or focus area and reveals potential research gaps within each theme.
Asset management is the longest and most costly part of any built asset’s life cycle. Municipalities face challenges in managing often quite large built asset portfolios with limited budgets. The spatial component of many assets often has a significant impact on cost, as well as on asset health and usage. This component is difficult to grasp and can be quite data heavy. GIS technology can solve many spatial challenges related to asset management such as site assessment, monitoring, prioritization of rehabilitation, risk assessment, inventory, and cost and emissions analyses. GIS can also be used to visualize data and facilitate communication and understanding.
A systematic literature review was performed on GIS and public asset management to help municipalities, researchers, and practitioners navigate the research applications of GIS technologies across different asset classes and to identify opportunities for future technological development and collaboration. The systematic literature review was performed using the query
municipal OR public “GIS” “asset management” AND PUBYEAR > 2013 AND PUBYEAR < 2025 AND (LIMIT-TO (SUBJAREA, “ENGI”) OR LIMIT-TO (SUBJAREA, “COMP”))
The full corpus of selected literature was used to perform keyword trend analysis, and 62 works were selected for in-depth study.
The keyword trend analysis was done using VOSviewer [1], which can build keyword co-occurrence networks for visualization. The trend analysis also incorporated this search query:
“GIS” AND “CIVIL ENGINEERING” AND PUBYEAR > 2009 AND PUBYEAR < 2025 AND (LIMIT-TO (SUBJAREA, “ENGI”) OR LIMIT-TO (SUBJAREA, “COMP”)) AND (LIMIT-TO (DOCTYPE, “cp”) OR LIMIT-TO (DOCTYPE, “ar”))
which represents GIS and Civil Engineering research, and this search query:
municipal OR public “asset management” AND PUBYEAR > 2009 AND PUBYEAR < 2025 AND (LIMIT-TO (SUBJAREA, “ENGI”) OR LIMIT-TO (SUBJAREA, “COMP”)) AND (LIMIT-TO (DOCTYPE, “cp”) OR LIMIT-TO (DOCTYPE, “ar”))
which represents public asset management research. These search queries were incorporated for comparison.
The trend analysis revealed that research on individual asset types is becoming less popular, while information-technology-related research is becoming more popular. However, practical understanding of digital technologies requires further growth.
There is a gap in research for remote sensing, land use, urban growth, and forecasting. There is also opportunity to go further in risk assessment research to focus on specific types of risk such as floods, landslides, earthquakes, and natural disasters. A gap also exists in using Internet of Things approaches for GIS and public asset management.
While these trends are informative, the keyword co-occurrence analysis was limited to a high level, because using more than several dozen terms quickly leads to cluttered, hard-to-decipher graphs. The researchers therefore built a tool to allow more in-depth keyword-based gap analysis. The tool uses natural language processing (NLP) to group similar keywords together and create clusters around different research topics, showing where potential research gaps lie within each topic of a research field. It also allows many more keyword terms to be retained without cluttering the produced graphs, because each research topic cluster is presented as a separate graph, as opposed to the VOSviewer approach of presenting all clusters within one graph.
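The core idea behind such a tool can be sketched in a few lines: vectorize the keywords and cluster them into topic groups, then plot each topic separately. The keyword list below is invented, and TF-IDF with k-means is just one plausible choice of embedding and clustering, not necessarily what the authors' tool uses.

```python
# Illustrative sketch only: embed author keywords as TF-IDF vectors and
# cluster them into topic groups, so each topic can get its own graph.
# The keyword list is invented; the real tool may use other NLP methods.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

keywords = [
    "flood risk assessment", "seismic risk", "landslide hazard mapping",
    "remote sensing imagery", "satellite imaging", "lidar point cloud",
    "bim integration", "gis-bim interoperability", "data integration",
]

vecs = TfidfVectorizer().fit_transform(keywords)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(vecs)

# Group keywords by cluster; each group would become one topic graph.
for c in range(3):
    print(c, [k for k, l in zip(keywords, labels) if l == c])
```

Because each cluster is rendered separately, the full keyword set can be retained without the single-graph clutter that limits a conventional co-occurrence map.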
The presented NLP-based gap analysis tool was used on the GIS and public asset management research query, with the GIS and Civil Engineering query serving as a reference for the broader research field of which GIS and asset management is a subfield.
The major areas of opportunity were found to be in AI methods, specifically CNNs for processing the image-like map data produced by GIS. Additional opportunities lie in water resources, physical environment, risk assessment, and remote sensing, which in this analysis also includes imaging. The tool produced graphs showing the applied and yet-to-be-applied approaches and techniques for each research topic.
As part of the systematic literature review, 62 works were selected for in-depth study. The works were grouped into asset categories: Roads and Highways; Bridges; Railway; Electrical and Gas Utilities; Infrastructure and Urban Planning; Heritage; Public Buildings; Hospitals, Airports, and Universities; Water Distribution Systems (WDS); Sewer; and Dams.
A summary of each research work is presented, highlighting methods, approaches, and applications. These results help practitioners, researchers, and public authorities navigate the research field and select approaches for implementation and further study.
The in-depth study revealed further research gaps. Methods tend to be applied to managing existing assets rather than planning new ones. Linear assets such as roads are studied more often than point assets such as airports, and a larger variety of asset types could be studied. Within each asset category there are variations in asset type, such as different types of hospitals or different types of terrain; these variations could be studied further to understand the specific effects and impacts of GIS technologies. GIS can be further explored for site analysis, monitoring, and cost reduction. Visualization technologies are also continually improving, and any further development would be welcome. Computation is expensive, and calculations on large volumes of data are difficult to realize; 3D modeling is very interesting but remains computationally costly. There is also a need for unified software solutions instead of patchwork pipelines. This includes GIS-BIM integration, but also connecting other types of data sources such as seismic, hydraulic, and electrical.
There is a lack of publicly available datasets for GIS and public asset management. Data are difficult and costly to collect, which discourages sharing with the research community and leaves the field quite siloed. More publicly available datasets are needed to evaluate the performance of different methods, set benchmarks, increase collaboration, and improve the replicability of results. Because authors tend to publish only with their existing co-authors, any substantial collaboration between multiple research groups has the potential to produce breakout research.
There is also considerable interest in 3D modelling and BIM. Methods are emerging that could be further developed from 3D model to fully fledged digital twin. SAR and Point Clouds are sensing and visualization technologies that could be further leveraged across more applications. AI related research is also considerably prevalent.
A wide variety of methods and applications are emerging in the research field. The overall technology readiness level needs to be raised further to facilitate adoption among practitioners. More practical case studies, and implementation of all planned components of a research plan, will build confidence and lead to these technologies being used in asset management projects and operations.
The literature review concludes with an overview of practical applications of GIS for public asset management. This is done to aid public authorities in selecting GIS technologies for their asset management needs. Broad level highlights of general applications and their benefits are presented. This includes condition assessment, visualization and planning, and enabling digital twin.
A systematic literature review was performed to aid researchers, practitioners, and public authorities, navigate the field of GIS for public asset management. Broad level trend analysis was complemented with in-depth study that presents approaches and applications across specific asset types. An NLP AI powered tool is also presented to improve broad level gap analysis which identifies research gaps per research topic within a research field. A summary of applications geared toward public authorities is also presented to highlight broad technology applications across different types of assets.
This paper, “GIS based solutions for management of public building and infrastructure assets: a review of state of the art and research trend analysis”, was published in Smart Construction.
Popov P, Mozaffari MH, RazaviAlavi S, Jalaei F. GIS based solutions for management of public building and infrastructure assets: a review of state of the art and research trend analysis. Smart Constr. 2025(2):0009, https://doi.org/10.55092/sc20250009.