Where’s the Flood?: Real-time flood risk visualization via server-based MR enhances accessibility and public safety
A research group at Osaka University has developed a mixed reality (MR) system for intuitively sharing flood forecasts. The forecasts are rendered on a server and viewed in a mobile device's web browser over the Internet, allowing a wider population to see dynamic flood predictions and helping to increase risk preparedness.
Osaka, Japan – Climate change is increasing flood risks in urban areas, with heavy rainfall disasters now becoming a global problem. Numerical simulations predict flooding from heavy rain and river overflows, usually displaying results on flat maps. However, map scale can limit detailed risk assessment, making it difficult for residents to fully understand flood risks. Now, researchers from Osaka University have developed a mobile mixed reality (MR) system as a powerful tool for real-time flood risk visualization through server-based rendering and web-based access. This allows urban populations to view dynamic flood forecasts on their mobile devices, enhancing community preparedness and response.
By offloading the computational workload to a server, MR displays can be rendered efficiently, allowing real-time visualization on commonly used mobile devices such as smartphones.
Access to these MR displays is streamlined through web browsers, eliminating the need for specialized applications. This accessibility means that multiple devices can connect simultaneously, enabling widespread participation in MR visualizations. The system intelligently assigns the optimal server from a pool of rendering servers, ensuring efficient performance and scalability. This approach democratizes access to advanced flood forecasting tools, making it feasible for many users to engage with MR visualizations without the constraints of dedicated software.
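The release does not detail how the optimal rendering server is chosen. As a rough, hypothetical sketch of such an assignment policy (the class names, load metric, and capacity limits below are assumptions, not the authors' implementation), a least-loaded selection could look like this:

```python
# Hypothetical sketch: assign each incoming browser client to the
# least-loaded rendering server in a pool. Names and the load metric
# are illustrative assumptions, not the authors' implementation.
from dataclasses import dataclass, field


@dataclass
class RenderServer:
    host: str
    max_sessions: int
    sessions: set = field(default_factory=set)

    @property
    def load(self) -> float:
        return len(self.sessions) / self.max_sessions


def assign_server(pool: list[RenderServer], client_id: str) -> RenderServer:
    """Pick a server with spare capacity and the lowest relative load."""
    candidates = [s for s in pool if len(s.sessions) < s.max_sessions]
    if not candidates:
        raise RuntimeError("all rendering servers are at capacity")
    best = min(candidates, key=lambda s: s.load)
    best.sessions.add(client_id)
    return best


if __name__ == "__main__":
    pool = [RenderServer("render-01", 4), RenderServer("render-02", 4)]
    for client in ("phone-a", "phone-b", "phone-c"):
        server = assign_server(pool, client)
        print(f"{client} -> {server.host} (load {server.load:.2f})")
```

In practice the selection could also weigh GPU utilization or network latency, but the essential point is that the browser client never needs to know which server in the pool is doing the rendering.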
"As climate change heightens flood risks, mitigating these risks is crucial," says lead author of the study Ryoma Tsujimoto. "We expect that our study will help people to intuitively understand flooding risks, regardless of their expertise, and that eventually, social implementation of these research results will improve people's safety and promote and industrialize DX (Digital Transformation) in the built environment field."
Example of a 3D urban flood model. This model reproduces flooding within an area of 2 km from north to south and 2.5 km from east to west. Not only the depth of flooding, but also the velocity of the floodwaters can be represented by color changes.
CREDIT
©2024 Ryoma Tsujimoto et al., Environmental Modelling & Software
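As a rough illustration of how flood depth can be conveyed by color (the depth range and color ramp below are assumptions, not the color scheme used in the published model):

```python
# Illustrative only: map a flood depth in metres to an RGB colour on a
# simple blue-to-red ramp. The depth range and colours are assumptions,
# not the colour scheme of the published model.
def depth_to_rgb(depth_m: float, max_depth_m: float = 3.0) -> tuple[int, int, int]:
    t = max(0.0, min(depth_m / max_depth_m, 1.0))  # normalise to [0, 1]
    return (int(255 * t), 0, int(255 * (1 - t)))   # shallow = blue, deep = red


print(depth_to_rgb(0.5))  # shallow water -> mostly blue
print(depth_to_rgb(2.5))  # deep water    -> mostly red
```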
The article, “Server-enabled mixed reality for flood risk communication: On-site visualization with digital twins and multi-client support,” was published in Environmental Modelling & Software at DOI: https://doi.org/10.1016/j.envsoft.2024.106054
About Osaka University
Osaka University was founded in 1931 as one of the seven imperial universities of Japan and is now one of Japan's leading comprehensive universities with a broad disciplinary spectrum. This strength is coupled with a singular drive for innovation that extends throughout the scientific process, from fundamental research to the creation of applied technology with positive economic impacts. Its commitment to innovation has been recognized in Japan and around the world. Now, Osaka University is leveraging its role as a Designated National University Corporation selected by the Ministry of Education, Culture, Sports, Science and Technology to contribute to innovation for human welfare, sustainable development of society, and social transformation.
Website: https://resou.osaka-u.ac.jp/en
JOURNAL
Environmental Modelling & Software
METHOD OF RESEARCH
Computational simulation/modeling
SUBJECT OF RESEARCH
Not applicable
ARTICLE TITLE
Server-enabled mixed reality for flood risk communication: On-site visualization with digital twins and multi-client support
Best of both worlds: Innovative positioning system enhances versatility and accuracy of drone-viewpoint mixed reality applications
OSAKA UNIVERSITY
Osaka, Japan – A research group at Osaka University has developed an innovative positioning system that correctly aligns the coordinates of the real and virtual worlds without the need to define flight routes in advance. This is achieved by integrating two vision-based self-localization methods: a visual positioning system (VPS) and natural feature-based tracking. This development paves the way for versatile drone-based mixed reality (MR) using commercially available drones. Drone-based MR is expected to find use in a variety of applications, such as urban landscape simulation and support for maintenance and inspection work, contributing to the further development of drone applications, especially in the fields of architecture, engineering, and construction (AEC).
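The release does not spell out how the two localization sources are fused. The sketch below is one minimal, hypothetical way it could work, assuming a simplified 2D pose and illustrative class names rather than the authors' implementation: an intermittent, geo-referenced VPS fix re-estimates the offset between the tracker's coordinate frame and the world frame, and the continuous feature-based tracking is mapped through that offset between fixes.

```python
# Hypothetical sketch of fusing two self-localisation sources to keep the
# virtual world aligned with the real one: an intermittent, geo-referenced
# VPS fix plus continuous natural-feature (visual-odometry style) tracking.
# Pose is simplified to 2D position and heading; this is not the authors' code.
import math
from dataclasses import dataclass


@dataclass
class Pose2D:
    x: float    # metres
    y: float    # metres
    yaw: float  # radians


class AlignmentFilter:
    """Maintains the transform mapping tracker coordinates to world coordinates."""

    def __init__(self) -> None:
        self.dx = 0.0
        self.dy = 0.0
        self.dyaw = 0.0

    def update_from_vps(self, tracker: Pose2D, vps: Pose2D) -> None:
        """When a VPS fix arrives, re-estimate the tracker-to-world offset."""
        self.dyaw = vps.yaw - tracker.yaw
        cos_d, sin_d = math.cos(self.dyaw), math.sin(self.dyaw)
        # Rotate the tracker position into world orientation, then translate.
        self.dx = vps.x - (cos_d * tracker.x - sin_d * tracker.y)
        self.dy = vps.y - (sin_d * tracker.x + cos_d * tracker.y)

    def to_world(self, tracker: Pose2D) -> Pose2D:
        """Between VPS fixes, map continuous tracker poses into world coordinates."""
        cos_d, sin_d = math.cos(self.dyaw), math.sin(self.dyaw)
        return Pose2D(
            cos_d * tracker.x - sin_d * tracker.y + self.dx,
            sin_d * tracker.x + cos_d * tracker.y + self.dy,
            tracker.yaw + self.dyaw,
        )
```

Here the VPS supplies an absolute, geo-referenced anchor only occasionally, while the feature tracker supplies smooth relative motion in between, which is what removes the need for predefined flight routes.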
In recent years, interest in integrating drones across diverse sectors, particularly within AEC, has grown. The use of drones in AEC has expanded owing to their advantages in time, accuracy, safety, and cost. Combining drones with MR stands out as a promising avenue because it is not restricted by the user's range of movement and is effective for landscape simulations of large-scale spaces such as cities and buildings. Previous studies proposed methods to integrate MR with commercial drones using versatile technologies such as screen sharing and streaming; however, these methods required predefined drone flight routes to keep the real and virtual worlds in sync, reducing the versatility of the applications and limiting the use cases of MR.
While this research does not implement a drone-based MR application for actual use, the proposed alignment system is highly versatile and has the potential for various additional functionalities in the future. This brings us one step closer to realizing drone-centric MR applications that can be utilized throughout the entire lifecycle of architectural projects, from the initial stages of design and planning to later stages such as maintenance and inspection.
First author Airi Kinoshita mentions, “The integration of drones and MR has the potential to solve various social issues, such as those in urban planning and infrastructure development and maintenance, disaster response and humanitarian aid, cultural protection and tourism, and environmental conservation by freeing MR users from the constraints of experiencing only their immediate vicinity, enabling MR expression from a freer perspective.”
A comparison of the positioning accuracy of the proposed system with that of the previous study. Compared with the results of the previous study (right column), the proposed system achieves higher positioning accuracy.
CREDIT
©2023 Airi Kinoshita et al., Drone Systems and Applications
The article, “Drone-Based Mixed Reality: Enhancing Visualization for Large-Scale Outdoor Simulations with Dynamic Viewpoint Adaptation Using Vision-Based Pose Estimation Methods,” was published in Drone Systems and Applications at DOI: https://doi.org/10.1139/dsa-2023-0135
JOURNAL
Drone Systems and Applications
METHOD OF RESEARCH
Computational simulation/modeling
SUBJECT OF RESEARCH
Not applicable
ARTICLE TITLE
Drone-Based Mixed Reality: Enhancing Visualization for Large-Scale Outdoor Simulations with Dynamic Viewpoint Adaptation Using Vision-Based Pose Estimation Methods