We conduct basic and applied research on systems that make intuitive sense of the world around them. From the development of novel deep neural network architectures for data-driven decision making to the invention of new algorithms that take advantage of revolutionary hardware such as quantum and neuromorphic circuits, we seek to change the world by pushing the boundary of what is possible in the here and now.
Information Sciences
Uncovering actionable knowledge and generating insight into exascale datasets from heterogeneous sources in real time
Developing methods and tools for understanding complex interactions and extracting actionable information from massive data streams.
Teams
Data Science at Scale
Extremely large datasets and extremely high-rate data streams are becoming increasingly common as Moore's Law drives down the cost of sensors, embedded computing, and traditional high-performance computing. Interactive analysis of these datasets is widely recognized as a new frontier at the interface of information science, mathematics, computer science, and computer engineering. Text search on the web is an obvious example of a large-dataset analysis problem; scientific and national security applications, however, require far more sophisticated interactions with data. These applications represent the 'data to knowledge' challenge posed by extreme-scale datasets in, for example, astrophysics, biology, climate modeling, cyber security, earth sciences, energy security, materials science, nuclear and particle physics, smart networks, and situational awareness.

To contribute effectively to Los Alamos National Laboratory's overall national security mission, we need a strong capability in Data Science at Scale. This capability rests on robust, integrated efforts in data management and infrastructure, visualization and analysis, high-performance computational statistics, machine learning, uncertainty quantification, and information exploitation, and it provides tools that make quantifiably accurate predictions for complex problems while using data and computing resources efficiently.
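A toy illustration of the in situ, one-pass flavor of this work: when a stream is too large to store and revisit, summary statistics must be computed as the data arrive. The sketch below is a minimal example using Welford's online algorithm; the synthetic "sensor" stream is an assumption for the demo, and none of this is LANL software.

```python
# One-pass (in situ) streaming statistics: constant memory, one update per sample.
import random

class RunningStats:
    """Welford's online algorithm: mean and variance in a single pass."""
    def __init__(self):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0  # running sum of squared deviations from the mean

    def update(self, x):
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)

    @property
    def variance(self):
        return self.m2 / (self.n - 1) if self.n > 1 else 0.0

stats = RunningStats()
for _ in range(1_000_000):              # stand-in for a high-rate sensor stream
    stats.update(random.gauss(0.0, 1.0))
print(f"n={stats.n}  mean={stats.mean:.4f}  var={stats.variance:.4f}")
```

The same one-pass pattern extends to histograms, sketches, and other summaries when the raw data cannot be stored for later interactive analysis.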
Capabilities:
- R&D in analysis and visualization for extreme-scale scientific datasets, both in situ and in post-processing
- Ensemble analysis
- GPU-based algorithm optimization and domain-specific data compression
- Virtual environments for scientific data analysis
- Support for the hardware and software that make extreme-scale visualization possible at LANL
Artificial Intelligence
Developing computational methods that "learn by example": mathematical models that use past observations to make predictions about future data. Our research team is at the forefront of developing new methodologies for data-driven decision making and applying these techniques to a wide variety of real-world applications.
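As a minimal, self-contained illustration of "learning by example," the sketch below fits a least-squares model to past observations and applies it to unseen inputs. The synthetic data and the choice of a linear model are assumptions for the demo, not a description of our methods.

```python
# Learn by example: fit parameters to past observations, predict future data.
import numpy as np

rng = np.random.default_rng(0)

# Past observations: noisy samples of an unknown process, here y = 2x + 1 + noise.
x_train = rng.uniform(0.0, 10.0, size=200)
y_train = 2.0 * x_train + 1.0 + rng.normal(0.0, 0.5, size=200)

# "Learning" = choosing the model parameters that best explain the examples.
A = np.stack([x_train, np.ones_like(x_train)], axis=1)
(slope, intercept), *_ = np.linalg.lstsq(A, y_train, rcond=None)

# "Prediction" = applying the learned model to inputs it has never seen.
x_new = np.array([12.0, 15.0])
print(slope * x_new + intercept)  # close to [25.0, 31.0]
```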
Capabilities:
- AI/ML methods & implementations
- Uncertainty quantification and robustness
- Physics-informed machine learning
- Graph-based machine learning
- Computer vision, image analysis, video analytics
- Signal analysis
- Natural language processing, text analysis & information extraction
- Applied statistics and Bayesian statistics
- Optimization
- Ontologies and knowledge graphs
- Cheminformatics and atomistic modeling
Real-world applications:
- Inference, forecasting, and decision making
- Molecular dynamics simulations
- Chemical design and search
- Genomics, epigenetics, and phylogenetics
- Remote sensing
- X-ray diffraction
- Dynamical systems, computational fluid dynamics
- Inverse problems, computational tomography
- Automated chrestomathy
- Document corpus analysis
Novel Computing
Algorithm development for non-traditional computational models and systems. We have active research programs in quantum computing, neuromorphic computing, and custom ASICs that serve as coprocessors to accelerate critical computations.
- Quantum computing - Quantum computers promise exponential speedups for certain problems of interest in physics and computing. Our research covers a broad range of topics, including quantum algorithms, quantum sensing, computational complexity, and both near-term and fault-tolerant quantum computing; a toy statevector sketch follows this list. We work with collaborators in theory and experiment to push the field forward.
- Neuromorphic computing (low-power ML) - Neuromorphic computers are based on the architecture of the brain: simple computational elements (neurons) connected by a massively parallel communications network. These machines have demonstrated reductions in energy use of more than two orders of magnitude. That efficiency matters because AI carries a significant carbon cost: recent estimates suggest that training a single large AI model can emit as much carbon as five cars over their lifetimes. For this reason, our research focuses on neuromorphic machine learning algorithms for AI; a spiking-neuron sketch also follows this list.
- Custom coprocessors (ASICs, FPGAs, VHDL) - Modern computational systems, such as the one in your phone, are typically built from a large number of optimized coprocessors. Our work in this area focuses on coprocessors of use to the LANL mission, including processors for physical simulations.
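Two toy sketches for the bullets above; both are illustrative assumptions, not LANL software. First, the quantum one: a two-qubit statevector simulation that builds a Bell state from a Hadamard gate and a CNOT, the kind of primitive from which quantum algorithms are composed.

```python
# Statevector simulation of a two-qubit circuit producing a Bell state.
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
CNOT = np.array([[1, 0, 0, 0],                 # CNOT, control = qubit 0
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

state = np.array([1, 0, 0, 0], dtype=complex)  # start in |00>
state = np.kron(H, np.eye(2)) @ state          # superpose qubit 0
state = CNOT @ state                           # entangle the pair
print(np.round(state, 3))                      # (|00> + |11>)/sqrt(2)
```

Second, the spiking-neuron sketch: a leaky integrate-and-fire neuron, the simple computational element that neuromorphic hardware replicates in massive parallel. The time constant, threshold, and input drive are arbitrary demo values.

```python
# A leaky integrate-and-fire neuron driven by a constant input current.
dt, tau = 1e-3, 20e-3          # time step and membrane time constant (seconds)
v_thresh, v_reset = 1.0, 0.0   # spike threshold and post-spike reset

v, spikes = 0.0, []
for step in range(1000):       # simulate 1 second of activity
    # Leaky integration: the potential decays toward rest and accumulates input;
    # communication with other neurons would happen only via discrete spikes.
    v += dt / tau * (-v + 1.2)
    if v >= v_thresh:
        spikes.append(step * dt)
        v = v_reset
print(f"{len(spikes)} spikes in 1 s; first at t = {spikes[0]:.3f} s")
```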
Mission Integrative Computer Science
The Mission Integrative team spans cutting-edge computer science research, FPGA design, signal processing, modern software engineering, integration of new technologies, and support of hardware-software co-design. We bridge the gap between the laboratory's computer science research and the custom hardware application needs of NNSA Defense Programs, working between Computer Science and Global Security to develop new technologies for custom hardware systems, particularly for space applications. Through partnerships with mission application teams, we help new systems and missions rapidly adopt modern hardware and software practices and technologies that improve their performance, flexibility, and reliability.
Resources/Software
- Sequedex (DNA sequence classification)
- Simian (just-in-time-compiled parallel discrete-event simulator)
- SimX (Python-based engine for parallel discrete-event simulation)
- PetaVision (C++ library for designing and deploying large-scale neurally inspired computational models)
- LUNUS (software for diffuse X-ray scattering from protein crystals)
- PPT (Performance Prediction Toolkit)
- hippynn (Python library for atomistic machine learning)
- minervachem (Python library for cheminformatics and machine learning)