Los Alamos National Laboratory
Information Science and Technology Institute (ISTI)
Implementing and fostering collaborative research, workforce and program development, and technical exchange

Seminar Series

The IS&T seminars on various information science and technology topics are held every Wednesday from 3 to 4 PM in the CNLS Conference Room, unless otherwise noted.

Contact  

  • Institute Director
  • Stephan Eidenbenz
  • (505) 667-3742
  • Email
  • Institute Deputy Director
  • Kary Myers
  • (505) 606-1455
  • Email
  • Professional Staff Assistant
  • Nickole Aguilar Garcia
  • (505) 665-3048
  • Email

To schedule a speaker, please contact Nickole Aguilar Garcia.

To subscribe to IS&T seminar announcements, please use the Program Announcement tool (log in with your WIN domain password) and select "NSEC - Information Science and Technology Institute" to receive announcements.

August 23, 2017: Nicholas Polys, Virginia Tech

Immersive Analytics: New Approaches to Scaling High-Performance Visualization

Abstract

Several trends are accelerating the flow from the acquisition and analysis of data to the production of human knowledge. Powerful hardware and efficient software are producing large amounts of simulation and sensor data products. Analytic and decision-making environments are fusing more and more data sources and types for human sense-making. In this talk, I will present an approach to scaling High-Performance Visualizations (HPV) for human design, decision-making, and discovery. We consider visual analytics as an optimization problem of information channels between mind and computer, seeking to optimize throughput based on both computational and perceptual/cognitive principles. We reflect on the deployment of HPV display platforms in a university setting and consider the challenges and opportunities for scaling human interaction and perception in immersive analytics.
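One way to make the "information channel" framing above concrete is the standard Shannon channel capacity; the abstract does not specify a formalism, so this is offered only as background context, not as the speaker's model:

    C = \max_{p(x)} I(X;Y) = \max_{p(x)} \sum_{x,y} p(x)\, p(y \mid x) \log_2 \frac{p(y \mid x)}{p(y)}

Here the "channel" is the visualization pipeline between computer and viewer, and optimizing throughput corresponds to choosing encodings that maximize the mutual information actually perceived.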

Host: Curt Canada, 505-665-7453, cvc@lanl.gov

August 2, 2017: George Markowsky, Missouri S&T

The Metric at the End of the Rainbow

Abstract

It is common to see statements such as the following, which comes from https://www.dhs.gov/science-and-technology/csd-elsmu: "Defining effective information security metrics has proven difficult, even though there is general agreement that such metrics could allow measurement of progress in security measures and, at a minimum, rough comparisons of security between systems. . . . However, general community agreement on meaningful metrics has been hard to achieve. This is due in part to the rapid evolution of IT, as well as the shifting focus of adversarial action." This page neglects to state the real reason that agreement on meaningful metrics has been hard to achieve: it is not possible to construct a reasonable metric. This paper, which is based on results that have been known for a long time, will demonstrate that under reasonable requirements for a metric, it is not possible to construct one. Hence, searching for such a metric is like searching for a pot of gold at the end of the rainbow.
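For reference, these are the conventional axioms required of a metric d on a set X of systems; the abstract does not say which requirements the impossibility argument relies on, so this is only a sketch of the usual ones:

    \begin{align*}
    & d(x,y) \ge 0                && \text{(non-negativity)} \\
    & d(x,y) = 0 \iff x = y       && \text{(identity of indiscernibles)} \\
    & d(x,y) = d(y,x)             && \text{(symmetry)} \\
    & d(x,z) \le d(x,y) + d(y,z)  && \text{(triangle inequality)}
    \end{align*}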

Host: Chris Rawlings/Daniel Tauritz

July 12, 2017: Kate Keahey, Argonne National Laboratory

Chameleon: A Deeply Reconfigurable, Large Scale Instrument for Computer Science Experimentation

Abstract

The last few decades have seen an enormous amount of progress in developing the potential of distributed computing: cloud computing, edge computing, and the Internet of Things all inspire new ideas across the stack, from systems solutions to innovative applications. However, complex and distributed systems of this type – or their components – are hard to study using theoretical approaches alone. To study them properly, scientists require a deeply reconfigurable platform where they can explore phenomena in controlled environments, across a range of configurations, and at the scales needed to justify their results. Like physicists, biologists, and chemists, computer scientists need an experimental instrument that can serve as a vehicle of discovery.

The Chameleon project has built such an instrument. The testbed, deployed at the University of Chicago and TACC, consists of roughly 15,000 cores and 5 PB of total disk space, and leverages a 100 Gbps connection between the sites. The backbone of the testbed is configured as a large homogeneous partition to support experiments at scale; onto this framework were grafted heterogeneous elements, including InfiniBand networking, high-bandwidth I/O nodes, storage hierarchy nodes, and GPUs, to support a broad diversity of research projects, and that diversity is further enhanced by smaller clusters of FPGAs and ARM and Atom processors. To support Computer Science experiments, ranging from operating system and virtualization research to innovative applications, Chameleon provides a configuration system that gives users full control of the software stack: provisioning of bare metal, support for custom kernel reboot, and console access, but also a fully functioning cloud environment to support educational projects and cloud development.
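As an illustration of the kind of programmatic control described above, here is a minimal sketch of provisioning a node through an OpenStack-style interface of the sort Chameleon exposes; the cloud, image, flavor, and network names below are hypothetical placeholders rather than the testbed's actual identifiers:

    import openstack

    # Connect using credentials from a local clouds.yaml entry (entry name is illustrative)
    conn = openstack.connect(cloud="chameleon")

    # Look up an image and a bare-metal flavor; these names are placeholders
    image = conn.image.find_image("CC-Ubuntu22.04")
    flavor = conn.compute.find_flavor("baremetal")

    # Boot a node with full control of the software stack and wait until it is active
    server = conn.create_server(
        name="experiment-node",
        image=image.id,
        flavor=flavor.id,
        network="sharednet1",
        wait=True,
    )
    print(server.status)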

This talk will describe the goals, the design strategy, and existing and future capabilities of the testbed. I will also present a selection of the research and education projects conducted by our users and describe how they leveraged the features of the testbed. I will then discuss the potential created by having a well-defined Computer Science instrument, in particular how it can be used to advance reproducibility and sharing for the computer science community.

Host: Curt Canada

July 12, 2017: Dr. T. Alan Keahey, Conversant Media

Bridging the Scale Gap Between Visual Perception and Big Data

Abstract

The human visual perception system is the highest bandwidth sensory channel that we have for ingesting information. It provides instant feature and pattern recognition that scales well to human-scale datasets. However, the scalability of our visual perception is not infinite, and its limits are reached well before we reach the sizes seen in typical "Big Data" applications. A number of scalability factors affect our ability to view larger datasets, including perceptual, visual, interaction, display, and analytical scale, as well as available time. This talk will explore some of these differences in scale and show how a variety of visual and analytical techniques can be combined to help bridge the scale gap between what we are capable of perceiving and the data that we need to understand. During this discussion we will explore some of the specific scalability challenges that we face in the global digital marketing economy, and prior work with IBM Watson for Genomics.
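As a generic illustration of one common way to bridge this gap (not the speaker's specific method), the sketch below aggregates a dataset far larger than anything that can be plotted point-by-point into a fixed-size 2-D density grid matched to display resolution; the data and grid size are illustrative assumptions:

    import numpy as np

    # Synthetic "big data": 10 million 2-D points standing in for a real dataset
    rng = np.random.default_rng(0)
    x = rng.normal(size=10_000_000)
    y = 0.5 * x + rng.normal(scale=0.7, size=10_000_000)

    # Aggregate to a display-scale grid: ~1 million bins instead of 10 million glyphs
    density, xedges, yedges = np.histogram2d(x, y, bins=(1024, 1024))

    # A log transform keeps sparse and dense regions perceptually distinguishable
    img = np.log1p(density)
    print(img.shape, img.max())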

Host: Curt Canada

June 7, 2017: Dr. Simone Silvestri, Missouri S&T

Social-Behavioral Aware Optimization of Cyber Physical Systems

Abstract

Cyber-Human-Physical Systems (CHPS) are a particular class of systems in which not only are the cyber and physical components integrated, but the system's operation is inherently affected by human behavior and, in turn, affects the humans who use its services. The inclusion of such a human component introduces novel challenges in optimizing the functionality of these systems, since human behavior and psychological perception must be appropriately modeled.

In this talk we first consider the problem of cascading failures in smart grids. We exploit the paradigm of the Internet of Things (IoT) to learn user habits and realize flexible loads on the user side. The problem involves the System Operator, the Load Serving Entities, and the users' Smart Homes. The System Operator detects the contingency and determines how much load needs to be curtailed to prevent a cascading failure. Each Load Serving Entity determines how to distribute the load curtailment among its users. Finally, each Smart Home learns the user's habits and computes the best schedule of appliances to minimize the impact on those habits while meeting the load curtailment prescribed by the Load Serving Entity. Subsequently, we consider the problem of optimizing the energy consumption of a single Smart Home. We use large-scale online surveys to derive a psychological model of user perception of smart appliances as well as of appliance interdependencies. Using these psychological models, we formulate an optimization problem to find a schedule of appliances that maximizes the user's psychological perception within an energy budget constraint. We show that this problem is NP-Hard and provide an efficient algorithm based on simulated annealing, as sketched below.
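The following is a minimal, generic simulated-annealing sketch for this kind of budget-constrained appliance scheduling; the appliance data, scoring function, and cooling parameters are illustrative assumptions, not the model presented in the talk:

    import math
    import random

    # Illustrative appliances: (name, energy cost in kWh, user-perceived value if run)
    APPLIANCES = [("dishwasher", 1.2, 5.0), ("dryer", 2.5, 4.0),
                  ("ev_charger", 7.0, 9.0), ("water_heater", 3.0, 6.0)]
    ENERGY_BUDGET = 9.0  # kWh available in the scheduling window (assumed)

    def score(schedule):
        """Total perceived value of the scheduled appliances, or -inf if over budget."""
        energy = sum(a[1] for a, on in zip(APPLIANCES, schedule) if on)
        value = sum(a[2] for a, on in zip(APPLIANCES, schedule) if on)
        return value if energy <= ENERGY_BUDGET else float("-inf")

    def simulated_annealing(steps=5000, t_start=2.0, t_end=0.01):
        current = [False] * len(APPLIANCES)      # start with every appliance off
        best = current[:]
        for step in range(steps):
            t = t_start * (t_end / t_start) ** (step / steps)    # geometric cooling
            candidate = current[:]
            candidate[random.randrange(len(candidate))] ^= True  # flip one appliance
            delta = score(candidate) - score(current)
            if delta >= 0 or random.random() < math.exp(delta / t):
                current = candidate
                if score(current) > score(best):
                    best = current[:]
        return best

    print(simulated_annealing())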

Host: Chris Rawlings
