Tools for Improving Technology Investment Strategies
(As Illustrated by Large Space Telescopes)

How can fundamental technology investment strategies be planned to maximize return on investment and to facilitate the transition of the resulting technologies to their users?

What enabling technologies provide the best return on investment?

This study outlines the structure of a process that can help make decisions about advanced technology investment. The process conducts a systematic flow-down analysis of scientific investigation goals and measurements for a mission class, of the corresponding mission and system concepts, and of the enabling technologies needed to achieve the desired investigations.

A series of examples illustrates how a systematic technology investment process can make substantial contributions toward aligning the interests of the science and technology communities. The examples can be generalized to a broader scope of investigations, mission concepts, and enabling technologies.

Structure of Investment Guidance Process

[Figure: Science-Driven Investment Guidance Process]

The process begins at the top level, labeled as Step 1 in the figure, with a definition of the scientific goals and measurements required to perform a set of desired investigations. In succeeding steps, mission and system parameters are identified and compiled, and enabling technology portfolios are analyzed and evaluated. Steps 1-4 aim at evaluating technologies after projecting science return and system performance parameters. Step 5 involves iteration of technology portfolios while simultaneously modifying technology performance parameters. These first five steps are the focus of the examples shown below. Steps 6 and 7 involve, respectively, iterative modification of the mission and system concepts and of the scientific investigations.

Science Return and System Capabilities

An example of the type of data emerging from the process is the compilation of scientific return and system capabilities for a large number of telescope systems.

[Figure: Telescope Science Goals]

This information provided a means to evaluate the system capabilities needed to achieve various scientific investigations, and to identify the enabling technologies needed for these systems.

Hierarchy of Enabling Technologies

Identifying technologies needed to enable required system capabilities is one of the critical steps in developing technology development strategies. The enabling technologies are usually specified at different levels of detail. Some technologies are specified at the component level, while others are at the sub-system and system level. To account for this difference in detail, the technology data is organized into a tree-like hierarchy that has technology areas at the top level, and decomposes into many branches consisting of various technology sub-areas and technology elements. An example shows how a set of technologies for telescope systems, including robotic assembly, detectors and other enabling technologies, would be organized into a tree-like hierarchy.

[Figure: Large Aperture Telescope Technology]

This tree-like hierarchy is useful because it is "democratic", in the sense that every level in the hierarchy treats all of its immediate subordinates as having equal value. This approach is thought to be the least controversial, since it does not presuppose a preference for any one technology over the others. Different values can be assigned to different technologies when necessary, but we usually do so only in consultation with authoritative experts in the application of interest. Other arrangements, such as general interconnected networks and graphs in which some technologies carry more weight than others, are also possible within our analysis framework.
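
A minimal sketch of this "democratic" weighting is given below, under the assumption that each node splits its weight equally among its immediate subordinates, so a leaf technology's overall weight is the product of the equal shares along its path. The taxonomy fragment and function name are illustrative, not the study's actual tree or tools.

# Minimal sketch of "democratic" weighting in a technology hierarchy:
# every node splits its weight equally among its immediate subordinates.
# The taxonomy fragment below is illustrative only.

def leaf_weights(tree, weight=1.0):
    """Return {technology element: weight}, splitting weight equally per level."""
    if isinstance(tree, dict):           # internal node with named sub-areas
        share = weight / len(tree)
        out = {}
        for subtree in tree.values():
            out.update(leaf_weights(subtree, share))
        return out
    if isinstance(tree, list):           # sub-area listing its elements
        share = weight / len(tree)
        out = {}
        for element in tree:
            out.update(leaf_weights(element, share))
        return out
    return {tree: weight}                # leaf: an individual technology element

large_aperture_telescope = {
    "Robotic assembly": ["Manipulators", "Rendezvous & docking"],
    "Detectors": ["Visible arrays", "IR arrays", "Readout electronics"],
    "Optics": ["Lightweight mirrors", "Wavefront sensing & control"],
}

if __name__ == "__main__":
    for element, w in leaf_weights(large_aperture_telescope).items():
        print(f"{element}: {w:.3f}")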

For each of the enabling technologies in the hierarchical tree, the following data is typically collected (a simple data structure holding such a record is sketched after the list):

  • List of quantitative and/or qualitative parameters that characterize technological performance
  • Technology Readiness Level (TRL), which is included among these performance parameters
  • State-of-the-Art (SOA) values for these parameters
  • Time period or periods over which a performance forecast is desired
  • Range of performance values that may be achieved over this time period
  • Estimated cost to achieve the forecasted performance
  • Scheduled delivery date
  • Risk level
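
As a compact illustration, the record for each technology element might be held in a structure like the one below; the field names and types are our own shorthand for the bulleted items, not the study's actual schema.

# Sketch of a per-technology data record mirroring the bulleted list above.
# Field names and types are illustrative, not the study's actual schema.
from dataclasses import dataclass

@dataclass
class TechnologyRecord:
    name: str
    parameters: list[str]                           # quantitative/qualitative performance parameters
    trl: int                                        # Technology Readiness Level (one of the parameters)
    state_of_the_art: dict[str, float]              # SOA value for each parameter
    forecast_period: tuple[int, int]                # years over which a forecast is desired
    forecast_range: dict[str, tuple[float, float]]  # achievable range per parameter over that period
    estimated_cost: float                           # estimated cost to achieve the forecast performance
    delivery_date: int                              # scheduled delivery year
    risk_level: str                                 # qualitative risk level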

Estimating Advances in Technology Areas

One of the fundamental questions addressed by this study is: how does any given technological area advance in a collective sense, given that the area is made up of many individual technology elements? For example, how much must the area of robotic assembly advance in going from its current state of the art to prescribed technology performance targets? In addressing this question, we treat the data for each performance parameter as a technical opinion coming from an individual expert in that parameter. The performance data collected for the seven areas comes from a variety of individual sources, all of which must be combined in some way to estimate the technological advance in each area. To do this, we use the method of least squares, which computes the average value of technology advancement, given all of the performance data obtained from many individual experts.

[Figure: Performance Gain to Meet Technology Targets (Including Costs)]

The average performance advance needed to meet technology targets for 10m, 35m, and 100m telescope systems is shown in the figure, for each of the seven technology areas. In this chart, we have used numbers instead of explicit names for the technology areas, because iterations on this analysis are still under way. Such iterations are an integral step in the overall technology evaluation process as explained subsequently.

The estimates represented by the horizontal bars in the figure were obtained with the method of least squares, using data for about 100 performance parameters. The average performance advance, or gain, gives the decision maker a forecast of how much technology development is needed in each area. These results apply to the "democratic" approach taken in this particular study. Investment costs are included in the estimates by treating them as weighting parameters in the least-squares method.
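
A minimal sketch of one way such a cost-weighted least-squares average could be formed is shown below. It assumes each parameter's required gain is its target value divided by its state-of-the-art value, and that minimizing the weighted sum of squared deviations from a single area-level advance reduces to a weighted mean; the gain metric and all numbers are illustrative assumptions, not the study's actual data or formulation.

# Sketch of a cost-weighted least-squares estimate of the collective advance
# needed in one technology area.  Each performance parameter has a
# state-of-the-art value, a target value, and a weight (here, investment
# cost).  Minimizing sum_i w_i * (x - gain_i)**2 over a single area-level
# advance x reduces to the weighted mean of the per-parameter gains.
# The gain metric (target / state of the art) and the numbers are made up.

def area_advance(parameters):
    """parameters: iterable of (soa, target, weight) tuples for one area."""
    weighted = [(target / soa, weight) for soa, target, weight in parameters]
    total_weight = sum(w for _, w in weighted)
    return sum(g * w for g, w in weighted) / total_weight

example_area = [
    # (state of the art, target, cost weight) -- illustrative values
    (1.0, 4.0, 2.0),
    (0.5, 1.0, 1.0),
    (10.0, 25.0, 3.0),
]

if __name__ == "__main__":
    print(f"Cost-weighted average advance: {area_advance(example_area):.2f}x")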

Investment Strategies

In deciding about technology investment, the decision-maker has a number of strategies available. One option is to use the average cost-weighted estimates of technology advancement to decide how to allocate a limited budget. The investment allocation can be made proportional to the estimated technology advance for each area, and this proportional allocation can be displayed as a percentage of a given total budget, as shown in the pie chart below.

[Figure: Technology Pie Chart]
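
A minimal sketch of this proportional rule follows; the per-area advance values are illustrative placeholders, not the study's estimates for Areas 1-7.

# Sketch of proportional budget allocation: each technology area receives a
# share of the total budget proportional to its estimated (cost-weighted)
# advance.  The advance values are illustrative placeholders.

def proportional_allocation(advances, total_budget):
    """advances: {area: estimated advance}; returns {area: allocated amount}."""
    total = sum(advances.values())
    return {area: total_budget * adv / total for area, adv in advances.items()}

estimated_advance = {f"Area {i}": adv for i, adv
                     in enumerate([2.9, 1.4, 4.1, 1.0, 3.2, 2.0, 1.6], start=1)}

if __name__ == "__main__":
    for area, share in proportional_allocation(estimated_advance, 100.0).items():
        print(f"{area}: {share:.1f}% of the budget")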

The decision-maker can exercise other options for the allocation of funds. For example, another option allocates a severely constrained budget across the various sub-areas by first selecting from each area the elements with the largest benefit-to-cost ratio, and allocating the budget so that these selected elements are funded first, while technology elements with small benefit-to-cost ratios are funded later. Yet another investment strategy would fund the technologies with the smallest benefit-to-cost ratio first, in order to stimulate them to systematically increase this ratio through technology development.
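
A minimal sketch of the first of these constrained-budget options, funding elements in descending order of benefit-to-cost ratio until the budget is exhausted, is given below; the element names, benefits, and costs are invented for illustration.

# Sketch of a greedy allocation for a severely constrained budget: fund
# technology elements in descending order of benefit-to-cost ratio until
# the budget runs out.  Element names, benefits, and costs are invented.

def fund_by_benefit_cost(elements, budget):
    """elements: list of (name, benefit, cost); returns the funded names."""
    ranked = sorted(elements, key=lambda e: e[1] / e[2], reverse=True)
    funded, remaining = [], budget
    for name, benefit, cost in ranked:
        if cost <= remaining:
            funded.append(name)
            remaining -= cost
    return funded

candidates = [
    ("Wavefront sensing & control", 5.0, 1.0),
    ("Lightweight mirrors", 8.0, 4.0),
    ("Robotic assembly", 6.0, 5.0),
    ("IR detector arrays", 3.0, 2.0),
]

if __name__ == "__main__":
    print(fund_by_benefit_cost(candidates, budget=7.0))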

Iterative Process

[Figure: Decision and System Model Interaction]

The type of analysis required for decision-making involves relatively large-scale issues, such as forecasting the development of entire technological areas over periods of many years, and is typically done at a level of abstraction appropriate for technology development program planning, monitoring, and management. This level of abstraction is illustrated within the box entitled Decision Analysis Methods & Models. Although it focuses on long-term issues, such as the trends of development in a technology area, the decision analysis is most useful if its models are rooted in more detailed models and performance data for specific systems. These two levels of models and analysis can interact very fruitfully, leading to technology evaluation results that are "rooted" in physical models and performance data for physical systems. Achieving this kind of mutually reinforcing approach to technology evaluation, in which decision models interact with physical models and data, requires that data generation and updating be done in conjunction with the technology evaluation step.

The iteration between data generation and technology evaluation just described is best carried out by one or more individuals who are aware of all the complex issues involved in both activities. There is no substitute for good judgment when examining the results of these types of analysis. The technology-evaluation entity should have knowledge of all tiers in the technology taxonomy for which data is collected, as well as a good fundamental understanding of the confounding factors particular to each.

For more information, contact: Charles.R.Weisbin@jpl.nasa.gov

