Keynote Lectures

ICCS is well known for its line-up of keynote speakers.
This page will be frequently updated with new names, lecture titles and abstracts.

CONFIRMED SPEAKERS

David Abramson
Director, Research Computing Centre, Professor of Computer Science, FACM, FIEEE, FTSE, FACS, The University of Queensland
Australia
        Translational Computer Science

Manuel Castro Díaz
University of Málaga
Spain
        GPU Accelerated Shallow-Water Type Models for Tsunami Modelling

Alfons Hoekstra
University of Amsterdam
The Netherlands
        Towards Digital Twins in Healthcare (and some reflections on Computational Science)

Jiří Mikyška
Czech Technical University in Prague, Faculty of Nuclear Science and Physical Engineering, Department of Mathematics – Mathematical Modelling Group
Czechia
        Computational Methods in Thermodynamics of Multicomponent Mixtures with Applications in Compositional Simulation

Takemasa Miyoshi
Team Leader and Chief Scientist, RIKEN
Japan
        Big Data Assimilation Revolutionizing Numerical Weather Prediction Using Fugaku

Coral Calero Muñoz
University of Castilla-La Mancha
Spain
        Software: to green or not to green, that’s the question

Translational Computer Science
David Abramson
The University of Queensland, Australia

David is a Professor of Computer Science, and currently heads the University of Queensland Research Computing Centre.
He has been involved in computer architecture and high performance computing research since 1979.
He has held appointments at Griffith University, CSIRO, RMIT and Monash University.
Prior to joining UQ, he was the Director of the Monash e-Education Centre, Science Director of the Monash e-Research Centre, and a Professor of Computer Science in the Faculty of Information Technology at Monash.
From 2007 to 2011 he was an Australian Research Council Professorial Fellow.
David has expertise in High Performance Computing, distributed and parallel computing, computer architecture and software engineering.
He has produced more than 230 research publications, and some of his work has been integrated into commercial products. One of these, Nimrod, has been used widely in research and academia globally, and is also available as a commercial product, called EnFuzion, from Axceleon.
His world-leading work in parallel debugging is sold and marketed by Cray Inc, one of the world’s leading supercomputing vendors, as a product called ccdb.
David is a Fellow of the Association for Computing Machinery (ACM), the Institute of Electrical and Electronics Engineers (IEEE), the Australian Academy of Technology and Engineering (ATSE), and the Australian Computer Society (ACS).
His hobbies include recreational cycling, photography and making stained glass windows. He is also an amateur playwright, and author of Purely Academic.

ABSTRACT
Given the increasingly pervasive role and growing importance of computing and data in all aspects of science and society, fundamental advances in computer science and their translation to the real world have become essential. Consequently, there may be benefits to formalizing Translational Computer Science (TCS) to complement the traditional foundational and applied modes of computer science research, as has been done for translational medicine. TCS has the potential to accelerate the impact of computer science research overall. In this talk I discuss the attributes of TCS, and formally define it. I enumerate a number of roadblocks that have limited its adoption to date and sketch a path forward. Finally, I will provide some specific examples of translational research underpinning computational science projects and illustrate the advantages to both computer science and the application domains.

GPU Accelerated Shallow-Water Type Models for Tsunami Modelling
Manuel Castro Díaz
University of Málaga, Spain

Manuel Castro Díaz is a Full Professor at the University of Málaga and a member of the EDANYA group (Differential Equations, Numerical Analysis and Applications), focusing his research on the numerical analysis of non-conservative hyperbolic systems and the modeling of geophysical flows. Specifically, he has been working on the development and analysis of numerical schemes for such systems in the framework of ‘path-conservative’ finite volume methods, with applications to the simulation of coastal currents, floods, avalanches, sediment transport, turbidity currents, generation and propagation of tsunamis, biphasic flows, and magnetohydrodynamics. He has also been interested in the efficient implementation of the numerical models on multicore and multi-GPU architectures and, more recently, in uncertainty quantification, data assimilation, and the application of deep neural networks in the framework of tsunami modelling. He has published more than 100 works, most of them in top-rated journals on numerical analysis, computer science and geophysical applications. He received the J.L. Lions Award for Young Scientists in Computational Mathematics, awarded by ECCOMAS in 2008, and in 2018 his EDANYA group received the NVIDIA Global Impact Award for its contributions to tsunami modelling, in particular the development of the Tsunami-HySEA model, which has been incorporated into the core of several national tsunami early warning systems. His name was included in the Stanford list of the world’s top 2% of scientists in 2021, and he was honored to be a lecturer at ICM 2018 in the Numerical Analysis and Scientific Computing section.

ABSTRACT
In this talk we present a family of models for the simulation of earthquake- or landslide-generated tsunamis. All of them fit in the framework of shallow flows: the flow is assumed to be modeled by shallow-water type systems composed of one or several layers of fluid. Multilayer shallow-water models allow us to recover the vertical profile of the velocities, which may be relevant at the early stages of the landslide-fluid interaction, as well as non-hydrostatic corrections. Earthquake-generated tsunamis are assumed to be driven either by a simple Okada model or by more sophisticated ones, such as the SeisSol model. Concerning the evolution of the landslide, it is either treated as a rigid body whose motion is assumed to be known, or modeled by a Savage-Hutter type model. The resulting models are discretized using a high-order finite-volume path-conservative scheme and implemented in a multi-GPU framework. Finally, an exhaustive validation has been carried out by comparison with laboratory experiments and real events over real bathymetries.
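
For orientation, the simplest member of this family is the classical one-layer shallow-water system, written here in one space dimension; the multilayer and non-hydrostatic models discussed in the talk build on it:

\[
\partial_t h + \partial_x (hu) = 0, \qquad
\partial_t (hu) + \partial_x \left( h u^2 + \tfrac{1}{2} g h^2 \right) = - g h \, \partial_x b,
\]

where h is the water depth, u the depth-averaged velocity, g the gravitational acceleration and b the bathymetry. The source term on the right-hand side makes the system non-conservative, which is precisely what the path-conservative finite-volume framework mentioned above is designed to handle.
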
Towards Digital Twins in Healthcare (and some reflections on Computational Science)
Alfons Hoekstra
University of Amsterdam, The Netherlands

Prof. dr. ir. Alfons Hoekstra is a full professor in Computational Science & Engineering at the Computational Science Lab of the Informatics Institute of the University of Amsterdam. His research focuses on the Virtual Human Twin (with applications in, among others, the cardiovascular and cerebrovascular domains), on Multiscale Modeling of Complex Systems, and on High Performance Computing. He is an editor of the Journal of Computational Science, a member of the Strategy Board for Computational Science NL, and a member of the advisory committee on Digitalisation of Research of the Dutch Science Foundation. He served as director of the Informatics Institute of the University of Amsterdam from 2020 to 2023 and currently serves as scientific director of the technology hub for Molecular and Material Design.

ABSTRACT
The concept of a digital twin, in which a computational model of a physical system takes continuous input data from that system, predicts future events based on those data and, if needed, intervenes in the physical system, started at NASA as a ‘living model’ to mitigate the Apollo 13 oxygen tank explosion and has witnessed strong growth over the last decade. I will report on a vision on Digital Twins in Healthcare (DTH) emerging in Europe, and discuss the example of a DTH for Acute Ischemic Stroke. In doing so I will also reflect on recent trends in Computational Science, triggered among others by developments in Digital Twins.
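
The abstract does not prescribe any particular architecture, but the observe-assimilate-predict-intervene loop it describes can be sketched in a few lines. Everything below is an illustrative toy under assumed names and dynamics (a scalar state, a simple nudging gain, an exponential forward model), not material from the talk:

    from dataclasses import dataclass

    @dataclass
    class TwinState:
        x: float  # a single scalar state variable (illustrative)

    def assimilate(state: TwinState, observation: float, gain: float = 0.5) -> TwinState:
        # Nudge the model state toward the incoming observation.
        return TwinState(x=state.x + gain * (observation - state.x))

    def predict(state: TwinState, steps: int, rate: float = 1.05) -> float:
        # Toy forward model: exponential growth of the state variable.
        return state.x * rate ** steps

    def twin_loop(observations, alarm_threshold: float = 10.0):
        # The digital-twin cycle: ingest data, forecast, intervene if needed.
        state = TwinState(x=observations[0])
        for obs in observations[1:]:
            state = assimilate(state, obs)
            forecast = predict(state, steps=5)
            if forecast > alarm_threshold:
                yield f"intervene: forecast {forecast:.1f} exceeds {alarm_threshold}"
            else:
                yield f"ok: forecast {forecast:.1f}"

    for message in twin_loop([1.0, 2.0, 4.0, 7.0, 9.0]):
        print(message)
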
Computational Methods in Thermodynamics of Multicomponent Mixtures with Applications in Compositional Simulation
Jiří Mikyška
Czech Technical University in Prague, Czechia

Jiří Mikyška is a professor of applied mathematics at the Faculty of Nuclear Sciences and Physical Engineering, Czech Technical University in Prague. The principal focus of his group’s research is on the development of robust and efficient numerical methods for the simulation of transport processes in porous media, with applications in enhanced oil recovery, hydrogen storage, and CO2 sequestration. The research is carried out in two major directions: thermodynamics of multicomponent mixtures and compositional simulation. Thermodynamics of multicomponent mixtures involves the development of new thermodynamic formulations and new robust and efficient numerical algorithms for phase stability testing of multicomponent mixtures and the computation of multi-phase equilibria. Compositional simulation involves the development of new formulations of transport models and their coupling to the thermodynamic models. The group focuses on advanced and higher-order discretization methods (mixed and mixed-hybrid finite element methods, finite volume methods, discontinuous Galerkin methods), on different schemes for coupling the transport and thermodynamic models, and on the inclusion of various transport processes into the model (convection, diffusion, gravity, viscosity, capillarity).

ABSTRACT
Phase stability testing and phase equilibrium calculations are central problems of the thermodynamics of multicomponent mixtures, with many applications in chemical engineering, enhanced oil recovery, and CO2 sequestration. The thermodynamic basis of both problems was formulated by J. W. Gibbs (~1885), and the first computational methods for solving them appeared at the very beginning of the digital computer era (Rachford and Rice, 1954). The conventional computational methods used to this day originate in the early 1980s. In the conventional approach, the mixture is described by pressure, temperature, and overall chemical composition (PTN-formulation). The task is to determine whether the mixture remains stable or splits into two (or more) phases. In the case of phase splitting, one has to determine the chemical compositions and amounts of the phases. The stable equilibrium state corresponds to the global minimum of the total Gibbs free energy of the system. These methods can subsequently be used as part of a compositional transport simulation, where the phase equilibrium calculation is typically performed at each finite element at every time step, requiring the computational methods to be both robust and efficient. The challenging part is the correct coupling of local thermodynamic equilibrium calculations with the compositional transport model. In this talk, I will show two alternative formulations of the basic problems in which the state of the mixture is described either by volume, temperature, and moles (VTN-formulation), or by internal energy, volume, and moles (UVN-formulation). Although these formulations were previously overlooked in the literature, I will show that they are in a sense more natural than the conventional approach when one wants to couple them with the transport simulation. I will present a unified approach that allows the development of a single solver treating all three formulations in a unified way. I will also present examples of compositional simulations and phase equilibrium calculations in the various formulations, as well as recent results extending the conventional approach to the calculation of phase equilibria of mixtures splitting into up to four phases.
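
To give a flavour of the classical machinery the talk starts from: for fixed equilibrium ratios K_i, the two-phase split of Rachford and Rice (1954) cited above reduces to a one-dimensional root-finding problem for the vapor fraction V. A minimal sketch, with an illustrative three-component feed and K-values that are not from the talk:

    def rachford_rice(z, K, tol=1e-12, max_iter=200):
        """Solve sum_i z_i*(K_i - 1)/(1 + V*(K_i - 1)) = 0 for the
        vapor fraction V by bisection; assumes a root exists in (0, 1)."""
        def f(V):
            return sum(zi * (Ki - 1.0) / (1.0 + V * (Ki - 1.0))
                       for zi, Ki in zip(z, K))
        lo, hi = 0.0, 1.0  # f is strictly decreasing on this interval
        for _ in range(max_iter):
            mid = 0.5 * (lo + hi)
            if f(mid) > 0.0:
                lo = mid
            else:
                hi = mid
            if hi - lo < tol:
                break
        V = 0.5 * (lo + hi)
        x = [zi / (1.0 + V * (Ki - 1.0)) for zi, Ki in zip(z, K)]  # liquid composition
        y = [Ki * xi for Ki, xi in zip(K, x)]                      # vapor composition
        return V, x, y

    V, x, y = rachford_rice(z=[0.5, 0.3, 0.2], K=[2.5, 1.2, 0.3])
    print(f"vapor fraction V = {V:.4f}")
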
Big Data Assimilation Revolutionizing Numerical Weather Prediction Using Fugaku
Takemasa Miyoshi
Team Leader and Chief Scientist, RIKEN, Japan

Dr. Takemasa Miyoshi received a B.S. (2000) from Kyoto University, and an M.S. (2004) and Ph.D. (2005) in meteorology from the University of Maryland (UMD). He started his professional career as a civil servant at the Japan Meteorological Agency (JMA) in 2000 and became a tenure-track Assistant Professor at UMD in 2011. Since 2012, he has been leading the Data Assimilation Research Team at the RIKEN Center for Computational Science (R-CCS). His scientific achievements include more than 160 peer-reviewed publications and more than 210 invited presentations. He has been recognized by prestigious awards such as the Meteorological Society of Japan Award (2016), the Yomiuri Gold Medal Prize (2018), and the Commendation by the Prime Minister for Disaster Prevention (2020).

ABSTRACT
At RIKEN, we have been exploring a fusion of big data and big computation in numerical weather prediction (NWP), and now with AI and machine learning (ML). Our group has been pushing the limits of NWP through computations two orders of magnitude bigger, using Japan’s previous flagship “K computer”. The efforts include 100-m mesh, 30-second-update “Big Data Assimilation” (BDA) fully exploiting the big data from a novel Phased Array Weather Radar. With the new “Fugaku”, since 2021 we have achieved a real-time BDA application that predicted sudden downpours up to 30 minutes in advance during the Tokyo Olympics and Paralympics. This presentation will introduce the most recent results from BDA experiments, followed by perspectives toward DA-AI fusion and expanding new applications beyond meteorology.
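
For readers outside the field, the heart of any such system is the analysis step that blends a model forecast with incoming observations. In its generic Kalman form (the abstract does not spell out the specific ensemble variant used in BDA) it reads

\[
\mathbf{x}^{a} = \mathbf{x}^{f} + \mathbf{K} \left( \mathbf{y} - H \mathbf{x}^{f} \right), \qquad
\mathbf{K} = \mathbf{P}^{f} H^{\top} \left( H \mathbf{P}^{f} H^{\top} + \mathbf{R} \right)^{-1},
\]

where x^f is the forecast state, y the vector of observations (here, phased-array radar data), H the observation operator, and P^f and R the forecast and observation error covariances. A 30-second update cycle means this analysis must complete, on the full model state, every 30 seconds.
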
Software: to green or not to green, that’s the question
Coral Calero Muñoz
University of Castilla-La Mancha, Spain

Coral Calero is a Professor at the University of Castilla-La Mancha in Spain and holds a PhD in Computer Science. She is a member of the Alarcos Research Group, where she is responsible for the “Green and Sustainable Software” research line, which develops two main lines of work. The first addresses issues such as measuring the impact that software and information systems have on the environment and how to improve their energy efficiency, as well as human and economic aspects related to software sustainability. The second supports all the group’s dissemination activities to raise awareness of the impact that software has on the environment.

ABSTRACT
That software moves the world is a clear fact, and that it is becoming more and more important, too. Three aspects have led to an increase in the intensity with which software is used: the Internet and social networks, data, and artificial intelligence.
However, not everything is positive in the support that software provides to our daily lives. There are estimates that ICT will be responsible for 20% of global energy consumption by 2030, part of which will be due to software. And precisely the three aspects mentioned require large amounts of energy.
In this keynote we will review different concepts related to software sustainability and show some results of the software consumption measurements we have carried out: on the one hand, cases intended to raise awareness in society in general about the impact that software has on the environment; on the other hand, examples related to the consumption of data and artificial intelligence, carried out with the aim of creating a set of best practices for software professionals.
Our ultimate goal is to make you aware of the consumption problem associated with software and to ensure that, if at first we were concerned with the “what” and later with the “how”, now it is time to focus on the “with what”.
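
The talk’s measurement results are its own; purely as an illustration of how software energy consumption can be measured in practice, the sketch below wraps a function call with readings of the package-energy counter that Linux’s powercap (Intel RAPL) interface exposes. The counter path varies by machine, reading it may require elevated privileges, and this toy is not the instrumentation used by the Alarcos group:

    import time
    from pathlib import Path

    # Package-level energy counter (microjoules) exposed by the Linux
    # powercap interface on many Intel/AMD systems; the path may differ.
    RAPL = Path("/sys/class/powercap/intel-rapl:0/energy_uj")

    def measure(fn, *args, **kwargs):
        """Return (result, joules, seconds) for one call of fn.
        Counts whole-package energy, not just this process, and
        ignores counter wraparound for simplicity."""
        e0, t0 = int(RAPL.read_text()), time.perf_counter()
        result = fn(*args, **kwargs)
        e1, t1 = int(RAPL.read_text()), time.perf_counter()
        return result, (e1 - e0) / 1e6, t1 - t0

    _, joules, secs = measure(sum, range(50_000_000))
    print(f"{joules:.2f} J in {secs:.2f} s (~{joules / secs:.1f} W)")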