School of Computing and Information Sciences @ FIU
Videos, Photos, and PowerPoint Slides Are Now Available
On The Future of High Performance Computing: How To Think for Peta and Exascale Computing
Jack Dongarra
Member of the National Academy of Engineering
University of Tennessee
Thursday, February 2nd, 2012 at 2:30pm
Green Library 100
In this talk we examine how high performance computing has changed over the last ten years and look toward the future in terms of
trends. These changes have had, and will continue to have, a major impact on our software. Some of the software and algorithm
challenges have already been encountered, such as management of communication and memory hierarchies through a combination of
compile-time and run-time techniques, but the increased scale of
computation, depth of memory hierarchies, range of latencies, and increased run-time environment variability will make these problems much harder.
We will look at five areas of research that will have an important impact on the development of software,
focusing on the following themes:
- Redesign of software to fit multicore and hybrid architectures
- Automatically tuned application software
- Exploiting mixed precision for performance
- The importance of fault tolerance
- Communication-avoiding algorithms
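The mixed-precision theme can be illustrated with a classic iterative-refinement sketch: do the expensive O(n^3) factorization work in single precision, then recover double-precision accuracy with cheap O(n^2) refinement steps. This is a generic toy (the function name and test matrix are illustrative, not from the talk):

```python
import numpy as np

def mixed_precision_solve(A, b, iters=5):
    """Solve Ax = b by solving in float32, then refining in float64.

    The costly solve runs in low precision; each refinement step
    computes the residual in high precision and applies a cheap
    low-precision correction, recovering double-precision accuracy
    for well-conditioned systems.
    """
    A32 = A.astype(np.float32)
    # Low-precision initial solve (stands in for an LU factorization).
    x = np.linalg.solve(A32, b.astype(np.float32)).astype(np.float64)
    for _ in range(iters):
        r = b - A @ x                                    # residual in float64
        d = np.linalg.solve(A32, r.astype(np.float32))   # cheap correction
        x = x + d.astype(np.float64)
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((100, 100)) + 100.0 * np.eye(100)  # well-conditioned
b = rng.standard_normal(100)
x = mixed_precision_solve(A, b)
residual = np.linalg.norm(b - A @ x) / np.linalg.norm(b)
```

On hardware where single precision runs faster than double (as on many accelerators), this trades a few extra cheap iterations for doing the dominant cost at the faster rate.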
Jack Dongarra is a University Distinguished Professor of Computer Science at the University of Tennessee and a
Distinguished Research Staff member at Oak Ridge National Laboratory (ORNL). He is the director of the Innovative
Computing Laboratory at UT, which has a staff of 50 people doing research in the area of high performance
computing. He is also the director of the Center for Information Technology Research at UT. He has contributed
to the design and implementation of the following open source software packages and systems: EISPACK, LINPACK, the
BLAS, LAPACK, ScaLAPACK, Netlib, PVM, MPI, NetSolve, Top500, ATLAS, and PAPI. He has published
approximately 200 articles, papers, reports, and technical memoranda, and he is coauthor of several books. He was awarded
the IEEE Sidney Fernbach Award in 2004 for his contributions in the application of high performance computers using
innovative approaches. He is a Fellow of the AAAS, ACM, and the IEEE, and a member of the National Academy of Engineering.
And Logic Begat Computer Science: When Giants Roamed the Earth
Moshe Y. Vardi
Member of the National Academy of Engineering
Rice University
Friday, February 24th, 2012 at 2pm
PG5 Market Station Room 134
During the past fifty years there has been extensive, continuous, and growing interaction between logic and computer science. In fact,
logic has been called "the calculus of computer science." The argument is that logic plays a fundamental role in computer science, similar
to that played by calculus in the physical sciences and traditional engineering disciplines. Indeed, logic plays an important role in areas of
computer science as disparate as architecture (logic gates), software engineering (specification and verification), programming languages
(semantics, logic programming), databases (relational algebra and SQL), artificial intelligence (automated theorem proving), algorithms
(complexity and expressiveness), and theory of computation (general notions of computability). This non-technical talk will provide an overview
of the unusual effectiveness of logic in computer science, going back all the way to Aristotle and Euclid and showing how logic actually gave
rise to computer science.
Moshe Y. Vardi is the Karen Ostrum George Distinguished Service Professor of Computational Engineering and Director of the Ken Kennedy Institute for Information Technology at Rice University.
He chaired the Computer Science Department at Rice University from January 1994 till June 2002. Prior to joining Rice in 1993, he was at the IBM Almaden Research Center, where he managed the Mathematics
and Related Computer Science Department. His research interests include database systems, computational-complexity theory, multi-agent systems, and design specification and verification. Vardi received
his Ph.D. from the Hebrew University of Jerusalem in 1981. He is the author and co-author of about 400 articles, as well as two books, "Reasoning about Knowledge" and "Finite Model Theory and Its
Applications," and the editor of several collections.
Vardi is the recipient of numerous awards, including three IBM Outstanding Innovation Awards, the 2000 Gödel Prize, the 2005 ACM Kanellakis Award for Theory and Practice, the 2006 LICS Test-of-Time Award,
the 2008 ACM PODS Mendelzon Test-of-Time Award, the 2008 ACM SIGMOD Codd Innovations Award, the 2008 Blaise Pascal Medal for Computer Science from the European Academy of Sciences, the 2008 ACM Presidential
Award, the 2010 CRA Distinguished Service Award, the 2010 ACM Outstanding Contribution Award, and the 2011 IEEE Computer Society Harry H. Goode Award. He holds honorary doctorates from Saarland University,
Germany, and the University of Orléans, France. Vardi is an editor of several international journals and Editor-in-Chief of the Communications of the ACM. He is a Guggenheim Fellow, as well as a Fellow
of the Association for Computing Machinery, the American Association for the Advancement of Science, the Association for the Advancement of Artificial Intelligence, and the Institute of Electrical and Electronics
Engineers. He was designated a Highly Cited Researcher by the Institute for Scientific Information, and was elected a member of the US National Academy of Engineering, the American Academy of Arts and Sciences,
the European Academy of Sciences, and the Academia Europaea.
High Performance Computing and Computational Science Applications
Mary Fanett Wheeler
Member of the National Academy of Engineering
University of Texas, Austin
Friday, March 2nd, 2012 at 2pm
PG5 Market Station Room 134
The research of the Center for Subsurface Modeling (CSM) addresses the growing use of computers to simulate physical events and the use of these simulations
to study physical phenomena and to perform engineering analysis and design. Our team investigates high-performance parallel processing as a tool to model the behavior of fluids in
permeable geologic formations such as petroleum and natural gas reservoirs, groundwater aquifers and aquitards, and in shallow bodies of water such as bays and estuaries.
The accurate and efficient simulation of subsurface phenomena requires a blend of physical and geomechanical modeling of subsurface processes and careful numerical implementation.
Compounding these issues is a general lack of high-quality data for model calibration and verification. CSM researchers collaborate with outside experts to find suitable models
of physical systems, including such processes as fluid phase behavior, particle transport and dispersion, capillary pressure effects, flow in highly heterogeneous media
(possibly fractured and vuggy), geomechanical response and subsidence, and well production. These and other processes must be simulated accurately so as to avoid nonphysical numerical artifacts that can cloud
engineering judgment regarding risk assessment and the intervention and optimization of management objectives.
The Center is part of the Institute for Computational Engineering and Sciences (ICES). CSM comprises a close-knit team of faculty and research scientists with expertise in applied
mathematics, engineering, and computer, physical, chemical and geological sciences. This interdisciplinary approach to simulation permits a more effective integration of advanced mathematical and numerical
techniques with engineering applications.
Mary F. Wheeler received her bachelor's and master's degrees in Mathematics from the University of Texas at Austin, and, in 1971, she received a Ph.D. in Mathematics from Rice University.
She has served on the faculty of the University of Texas at Austin since 1995 and is presently Director of the Center for Subsurface Modeling in the Institute for Computational Engineering and Sciences.
Her research interests include the numerical solution of partial differential systems with applications to the modeling of subsurface and surface flows and parallel computation. Her numerical work includes the
formulation, analysis, and implementation of finite-difference/finite-element discretization schemes for nonlinear, coupled PDEs, as well as domain decomposition iterative solution methods.
She was recently granted the 2011 Humboldt Research Award, given by the Alexander von Humboldt Foundation to internationally renowned scientists and scholars.
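The finite-difference discretization work mentioned above can be illustrated with a toy example: a standard three-point stencil for the 1D Poisson problem -u'' = f on (0, 1) with zero boundary values. This is a generic textbook sketch, not the CSM simulation codes; the function name is illustrative.

```python
import numpy as np

def solve_poisson_1d(f, n):
    """Solve -u'' = f on (0, 1), u(0) = u(1) = 0, with second-order
    central differences on n interior points of a uniform grid."""
    h = 1.0 / (n + 1)
    x = np.linspace(h, 1.0 - h, n)            # interior grid points
    # Tridiagonal stencil matrix: (-1, 2, -1) / h^2
    A = (np.diag(2.0 * np.ones(n))
         - np.diag(np.ones(n - 1), 1)
         - np.diag(np.ones(n - 1), -1)) / h**2
    u = np.linalg.solve(A, f(x))
    return x, u

# Manufactured solution u(x) = sin(pi x), so f(x) = pi^2 sin(pi x).
x, u = solve_poisson_1d(lambda x: np.pi**2 * np.sin(np.pi * x), 50)
err = np.max(np.abs(u - np.sin(np.pi * x)))   # O(h^2) discretization error
```

Real subsurface simulators couple many nonlinear PDEs in 3D on heterogeneous media, but the core step is the same: discretize the differential operator and solve the resulting (very large) algebraic system.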
Simulating Quantum Collisions
Vince McKoy
California Institute of Technology
Friday, March 23rd, 2012 at 2pm
PG5 Market Station Room 134
Experiments have shown that even very slow electrons can break DNA strands. This surprising result suggests that the electrons are somehow trapped in states that
live long enough to promote bond rupture. Computational techniques can simulate such trapping processes and help develop an understanding of the strand-break mechanisms.
However, detailed quantum simulations of the collision between a slow electron and a polyatomic molecule are highly computationally intensive. I will discuss the techniques
we use to address this demanding problem and give a few examples of applications and their implications.
Professor McKoy is a world-renowned scientist in the area of theoretical chemistry. After receiving his Ph.D. in Chemistry from Yale University in 1964, he joined the
faculty of the California Institute of Technology. His research interests focus on the interactions of slow electrons with large molecules, and
especially biomolecules such as the bases and other
constituents of DNA. He uses high-level computational methods and high performance computing to help answer these and related questions in
electron-molecule dynamics. A key feature of this effort
has been the development of highly scalable strategies and algorithms which make it possible to exploit large parallel systems in such studies.
He is a Fellow of the American Physical Society and is listed in Who’s
Who in America and in American Men of Science. He has been a Fellow of the Alfred P. Sloan and Guggenheim Memorial Foundations and received the
Governor-General’s (of Canada) Medal for Academic Excellence (1960).
The Data and Compute-Driven Transformation of Modern Science
Edward Seidel
IEEE Sidney Fernbach Awardee
National Science Foundation Assistant Director, Mathematical and Physical Sciences
Wednesday, March 28th, 2012 at 2pm
SIPA-125
"We all know that modern science is undergoing a profound transformation
as it aims to tackle the complex problems of the 21st Century. It is
becoming highly collaborative; problems as diverse as climate change,
renewable energy, or the origin of gamma-ray bursts require understanding
processes that no single group or community alone has the skills to
address. At the same time, after centuries of little change, compute,
data, and network environments have grown by 9-12 orders of magnitude in
the last few decades. Moreover, science is not only compute-intensive but
is dominated now by data-intensive methods. This dramatic change in the
culture and methodology of science will require a much more integrated and
comprehensive approach to development and deployment of hardware,
software, and algorithmic tools and environments supporting research,
education, and, increasingly, collaboration across disciplines."