Optimization principles can be applied to a wide range of problems and fields. In fact, optimization is used every day — from television networks that target advertisements at specific population segments to a radiologist who uses selective-beam radiation to target cancer cells in a lung cancer patient.
Many of these problems can be formulated in one of the standard optimization paradigms, including linear programming, network optimization, nonlinear programming, stochastic optimization, or complementarity and variational inequality problems. In their research, optimization specialists develop algorithms for these and other types of optimization problems, study their mathematical properties and practical performance, implement them in high-quality software, and apply them to practical problems.
An overview of recent projects is given below; more detail about ongoing projects is also available.
Research involves the formulation and analysis of large-scale economic equilibrium models and the methodology behind them. One study examined the effects of trade and economic growth in response to global warming; another evaluated the effect of trade and economic growth on carbon emissions restrictions. Research projects often involve collaboration with the Global Trade Analysis Project (GTAP), a global network of researchers and policy makers who conduct quantitative analysis of international policy issues.
Our conservation-oriented research focuses primarily on examining how drivers of human behavior can elicit changes in ecological processes and services at the landscape scale. Most of our work centers on either the potential consequences of cellulosic ethanol production in agriculture-dominated landscapes, or how predominantly forested landscapes are affected under different economic scenarios. In addition to addressing research questions, our projects also strive to make current scientific concepts more accessible to non-scientists through the development of engaging and interactive computational tools and educational games. Learn more about these projects.
Operational (tactical) and strategic mathematical models are used in the optimization decision process to determine more efficient and effective ways to control and manage a system or process. Mathematical models are routinely used in the natural sciences (physics, biology), engineering (artificial intelligence, computer science) and the social sciences (economics, political science). However, a lack of agreement between theoretical (mathematical) models and experimental measurement in any discipline often leads to important advances, since better theories must be developed. Optimization helps create better theories and improves systems and processes in many disciplines. The biggest challenge is to engage the designer, collect the appropriate data and determine the appropriate model (linear vs. nonlinear, deterministic vs. probabilistic, static vs. dynamic, or discrete vs. continuous). Examples of research projects include the optimization of surgery room utilization and of a hospital’s resident rotation.
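As a small illustration of the surgery-room utilization problem mentioned above, the sketch below assigns procedures to operating rooms so that the latest-finishing room finishes as early as possible. The greedy longest-processing-time heuristic, the procedure durations, and the room count are all invented for illustration; they are not taken from the actual project, which may use a very different model.

```python
# Hypothetical sketch: assign procedures (durations in minutes) to a fixed
# number of operating rooms, trying to minimize the latest finishing time.
# This is a deterministic, static, discrete model in the taxonomy above.

def lpt_schedule(durations, n_rooms):
    """Greedy longest-processing-time heuristic: place each procedure,
    longest first, into the currently least-loaded room."""
    loads = [0] * n_rooms
    assignment = [[] for _ in range(n_rooms)]
    for d in sorted(durations, reverse=True):
        room = loads.index(min(loads))   # least-loaded room so far
        loads[room] += d
        assignment[room].append(d)
    return assignment, max(loads)

procedures = [120, 90, 90, 60, 45, 45, 30]   # invented durations
plan, makespan = lpt_schedule(procedures, n_rooms=3)
```

The heuristic is simple but illustrates the modeling choices the paragraph describes: the same utilization question could instead be posed as an integer program if exact optimality were required.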
A problem often encountered by biologists is defining the appropriate protein and drug (factor) concentrations in a mixture of signaling factors to realize a desired biological outcome. For example, this outcome may be the production of a particular cell type to be used either as a tool for discovery or as a therapeutic. Current high-throughput robotic screening technologies allow only a single factor at a single concentration to be dispensed into a single well. Unfortunately, this strategy limits the experimental scope, as it is expensive in both time and capital. We are developing a strategy, “combinatorial factor screening,” to overcome these obstacles. This strategy uses a standard liquid-handling platform, controlling its pipetting software through an in-house software model. It may be applied in both biology and chemistry, in academia and in the pharmaceutical industry.
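The core combinatorial idea can be sketched in a few lines: rather than one factor at one concentration per well, enumerate every multi-factor mixture to be dispensed. The factor names and concentrations below are placeholders invented for illustration, not data from the project.

```python
# Illustrative sketch of combinatorial factor screening: enumerate every
# k-factor mixture, crossed with each factor's candidate concentrations.
# Factor names and concentrations (ng/mL) are invented placeholders.
from itertools import combinations, product

factors = {"FGF2": [5, 25], "BMP4": [10, 50], "WNT3A": [20, 100]}

def mixtures(factors, k):
    """All k-factor subsets, at every combination of their concentrations."""
    wells = []
    for names in combinations(factors, k):
        for concs in product(*(factors[n] for n in names)):
            wells.append(dict(zip(names, concs)))
    return wells

plate = mixtures(factors, k=2)   # every 2-factor mixture, every conc. pair
```

Even this tiny example shows why single-factor screening is limiting: three factors at two concentrations each already yield twelve distinct two-factor wells, a space a robotic platform can cover but a manual protocol cannot easily explore.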
Large data sets require statistical analysis, but the data may come in varied forms (text, audio, video, sensor data, etc.). To cope with these variations in data structure, the HAZY project explores integrating statistical processing techniques with data processing systems to make such systems easier to build, maintain and deploy. Jellyfish, recently released software for large-scale matrix completion, is one example.
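The matrix completion problem Jellyfish solves can be illustrated with a toy incremental-gradient loop: recover a low-rank matrix from a subset of its entries by factoring the observations. This serial NumPy sketch only demonstrates the objective; it is not Jellyfish's parallel scheduling algorithm, and all dimensions and step sizes here are invented.

```python
# Toy matrix completion by incremental gradient descent: fit observed entries
# of a low-rank matrix M with a factorization L @ R. Serial illustration only;
# Jellyfish's contribution is running such updates in parallel without locks.
import numpy as np

rng = np.random.default_rng(0)
m, n, rank = 20, 15, 2
M = rng.standard_normal((m, rank)) @ rng.standard_normal((rank, n))
observed = [(i, j) for i in range(m) for j in range(n) if rng.random() < 0.6]

L = 0.1 * rng.standard_normal((m, rank))
R = 0.1 * rng.standard_normal((rank, n))
step = 0.05
for epoch in range(200):
    for i, j in observed:
        err = L[i] @ R[:, j] - M[i, j]   # residual on one observed entry
        Li = L[i].copy()
        L[i] -= step * err * R[:, j]     # gradient step on row factor
        R[:, j] -= step * err * Li       # gradient step on column factor

rmse = np.sqrt(np.mean([(L[i] @ R[:, j] - M[i, j]) ** 2
                        for i, j in observed]))
```

Because each update touches only one row of L and one column of R, updates on disjoint entries are independent — the structural fact that lock-free parallel schemes like Jellyfish exploit.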
Mass digitization of printed media into plain text is changing the types of data that companies and academic institutions manage. Scanned images are converted to plain text by conversion software, but the conversion is often error-prone. Any query of the digitized media may therefore miss information, leading to poor results in downstream applications. The Staccato software improves the digitization process.
Another project involves packing ellipsoids of given (but varying) dimensions into a finite container in a way that minimizes the maximum overlap between adjacent ellipsoids. A bilevel optimization algorithm is described for finding local solutions, both for the general case and for the easier special case in which the ellipsoids are spheres. Algorithmic and analysis tools from semidefinite programming and trust-region methods are key to the approach. The goal is to apply the method to the problem of chromosome arrangement in cell nuclei and to compare our results with the experimental observations reported in the biological literature.
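The sphere special case is easy to visualize with a toy sketch: start with crowded circles in a unit square and repeatedly push overlapping pairs apart, where the overlap of a pair is the amount by which the sum of radii exceeds the center distance. This naive repulsion loop is only a stand-in for the bilevel algorithm described above; the radii, container, and iteration count are invented.

```python
# Toy 2-D version of the sphere special case: reduce the maximum pairwise
# overlap of circles in the unit square. Overlap(i, j) = r_i + r_j - dist.
# A naive repulsion heuristic, NOT the paper's bilevel/trust-region method.
import numpy as np

rng = np.random.default_rng(1)
radii = np.array([0.2, 0.2, 0.25])
centers = rng.random((3, 2)) * 0.2 + 0.4     # start crowded near the middle

def max_overlap(centers, radii):
    worst = 0.0
    for i in range(len(radii)):
        for j in range(i + 1, len(radii)):
            d = np.linalg.norm(centers[i] - centers[j])
            worst = max(worst, radii[i] + radii[j] - d)
    return worst

start = max_overlap(centers, radii)
for _ in range(500):
    for i in range(len(radii)):
        for j in range(i + 1, len(radii)):
            diff = centers[i] - centers[j]
            d = np.linalg.norm(diff)
            gap = radii[i] + radii[j] - d
            if gap > 0 and d > 1e-12:        # move an overlapping pair apart
                push = 0.1 * gap * diff / d
                centers[i] += push
                centers[j] -= push
    # keep every circle inside the unit-square container
    centers = np.clip(centers, radii[:, None], 1 - radii[:, None])

end = max_overlap(centers, radii)
```

For ellipsoids, measuring the overlap of a single pair is itself an optimization problem, which is why the full method needs the bilevel structure and semidefinite-programming machinery the paragraph mentions.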
One research project determines optimal electrical power flow and another examines reconfiguring power systems. Since changes in energy generation, storage and flow have regional and national implications, most energy projects involve extensive collaboration with state and/or federal agencies. As an example, a recent bioenergy project involved collaboration among four agencies. Its purpose was to develop an interactive tool and a companion educational game to increase stakeholder engagement and accelerate policy development for regional biofuel production.
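At its simplest, the power-flow optimization mentioned above reduces to economic dispatch: choose generator outputs that meet demand at minimum cost, subject to capacity limits. The sketch below poses a two-generator version as a linear program; the costs, capacities, and 100 MW demand are invented for illustration and have no connection to the actual project.

```python
# Toy economic-dispatch problem in the spirit of optimal power flow:
# minimize 20*g1 + 35*g2 ($/MWh) subject to g1 + g2 = 100 MW,
# 0 <= g1 <= 60, 0 <= g2 <= 80. All numbers are invented.
from scipy.optimize import linprog

res = linprog(c=[20, 35],                  # generation costs ($/MWh)
              A_eq=[[1, 1]], b_eq=[100],   # supply must equal demand
              bounds=[(0, 60), (0, 80)])   # generator capacity limits
dispatch, cost = res.x, res.fun
```

The cheap generator runs at its 60 MW capacity and the expensive one covers the remaining 40 MW. Real optimal power flow adds network constraints (line limits, voltage angles), which turn this small LP into the much larger problems the project studies.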
The VIDI project focuses on the creative use of emerging nonintrusive sensor and computer vision technologies to build computer interfaces that allow users to interact with sounds and music through physical motion in a large space. Our VIDI system has a two-stage structure: (1) acquisition captures specific measures of movement and behavior and represents them as discrete data streams; (2) musical content generation uses these motion descriptors and body activity to generate a stream of Musical Instrument Digital Interface (MIDI) data and, optionally, an accompanying visual or light output. VIDI systems have been used to create interactive music/sculpture installations, live music/dance collaborations, and new interfaces for skilled performing musicians. VIDI projects are currently being developed by Chris Walczak, Anthony Caulkins and UW Professor of Music Composition Stephen Dembski. Nathaniel Bartlett previously worked on the VIDI project and created five videos called (((CLANG))) that demonstrate the use of VIDI technology.
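The second stage can be pictured with a minimal sketch that turns motion descriptors into raw MIDI bytes. The mapping here (body height to pitch, movement speed to velocity) is hypothetical, invented purely to illustrate the descriptor-to-MIDI idea; the actual VIDI mappings are not documented in this overview.

```python
# Hypothetical sketch of VIDI's content-generation stage: map normalized
# motion descriptors (0.0-1.0) to a raw MIDI note-on message. A note-on is
# three bytes: status (0x90 | channel), pitch (0-127), velocity (1-127).
# The height->pitch and speed->velocity mapping is invented for illustration.

def motion_to_midi(height, speed, channel=0):
    pitch = 36 + round(height * 48)     # C2..C6 range from body height
    velocity = 1 + round(speed * 126)   # faster movement plays louder
    return bytes([0x90 | channel, pitch, velocity])

msg = motion_to_midi(height=0.5, speed=0.5)
```

Emitting raw status/data bytes like this is enough to drive any synthesizer that accepts a MIDI stream, which is what lets the same descriptor pipeline feed installations, dance collaborations, or a performer's custom instrument.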