Transportation planning historically has followed the rational planning model of defining goals and objectives, identifying problems, generating alternatives, evaluating alternatives, and developing the plan. Other models for planning include rational actor, satisficing, incremental planning, organizational process, and political bargaining.
Within the rational planning framework, transportation forecasts have traditionally followed the sequential four-step model or urban transportation planning (UTP) procedure, first implemented on mainframe computers in the 1950s at the Detroit Area Transportation Study (DATS) and Chicago Area Transportation Study (CATS).
Land use forecasting sets the stage for the process. Typically, forecasts are made for the region as a whole, e.g., of population growth. Such forecasts provide control totals for the local land use analysis. The region is then divided into zones, and trend or regression analysis is used to estimate population and employment for each zone.
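As a rough illustration of the trend approach, the sketch below fits a linear trend to hypothetical zone populations and extrapolates to a horizon year; the zone names, census years, and population figures are all assumed for illustration.

```python
import numpy as np

# Hypothetical zone populations observed at four census years.
years = np.array([1990, 2000, 2010, 2020])
zone_pop = {
    "zone_1": np.array([12000, 13500, 15200, 16800]),
    "zone_2": np.array([8000, 8400, 8900, 9100]),
}

horizon = 2040
for zone, pop in zone_pop.items():
    slope, intercept = np.polyfit(years, pop, deg=1)   # linear trend fit
    forecast = slope * horizon + intercept
    print(f"{zone}: forecast {horizon} population ~ {forecast:,.0f}")
```

In practice, the zonal forecasts would then be scaled so that their sum matches the regional control total.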
The four steps of the classical urban transportation planning system model are:
- Trip generation determines the frequency of origins or destinations of trips in each zone by trip purpose, as a function of land use, household demographics, and other socio-economic factors.
- Trip distribution matches origins with destinations, often using a gravity model function, equivalent to an entropy-maximizing model (a minimal sketch follows this list). Older methods include the Fratar model.
- Mode choice computes the proportion of trips between each origin and destination that use a particular transportation mode. This model is often of the logit form, developed by Nobel Prize winner Daniel McFadden.
- Route assignment allocates trips between an origin and destination by a particular mode to a route. Often (for highway route assignment) Wardrop's principle of user equilibrium is applied (equivalent to a Nash equilibrium), wherein each traveler chooses the shortest (travel time) path, subject to every other driver doing the same. The difficulty is that travel times are a function of demand, while demand is a function of travel time.
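The distribution and mode choice steps can be illustrated with a small worked example. The following is a minimal sketch, not calibrated practice: the zone productions and attractions, the generalized-cost matrix, the impedance parameter, and the mode utility coefficients are all hypothetical values chosen for illustration.

```python
import numpy as np

# Hypothetical three-zone example: productions, attractions, and generalized costs.
productions = np.array([1000.0, 2000.0, 1500.0])    # trips produced by each zone
attractions = np.array([1200.0, 1800.0, 1500.0])    # trips attracted to each zone
cost = np.array([[2.0, 6.0, 9.0],
                 [6.0, 3.0, 5.0],
                 [9.0, 5.0, 2.0]])                   # generalized cost (e.g. minutes)

# Trip distribution: singly constrained gravity model with an exponential
# deterrence function, T_ij = P_i * A_j * exp(-beta*c_ij) / sum_k A_k * exp(-beta*c_ik).
beta = 0.3                                           # assumed impedance parameter
weights = attractions * np.exp(-beta * cost)         # A_j * f(c_ij)
trips = productions[:, None] * weights / weights.sum(axis=1, keepdims=True)

# Mode choice: binomial logit split between car and transit.
transit_time = 1.8 * cost                            # assumed transit travel times
v_car = -0.05 * cost                                 # systematic utility of car
v_transit = -0.05 * transit_time - 0.5               # utility of transit (mode constant)
p_car = np.exp(v_car) / (np.exp(v_car) + np.exp(v_transit))

car_trips = trips * p_car
transit_trips = trips * (1.0 - p_car)
print(car_trips.round(0))                            # origin-destination car trip table
print(transit_trips.round(0))                        # origin-destination transit trip table
```

In practice the gravity model is balanced to match both productions and attractions (e.g., by iterative proportional fitting), and the logit coefficients are estimated from survey data rather than assumed.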
After the classical model, evaluative decision criteria are applied. A typical criterion is benefit-cost analysis. Such analysis might be applied after the network assignment model identifies needed capacity: is such capacity worthwhile? In addition to these forecasting and decision steps, it is important to note that forecasting and decision-making permeate each step in the UTP process. Planning deals with the future, and it is forecast dependent.
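As a rough illustration of such a screening, the sketch below discounts an assumed stream of annual travel-time savings and compares it with an assumed capital cost; all figures, the discount rate, and the analysis horizon are hypothetical.

```python
# Hypothetical benefit-cost screening for an added-capacity project.
capital_cost = 50_000_000        # construction cost in dollars (assumed)
annual_benefit = 4_000_000       # value of annual travel-time savings (assumed)
discount_rate = 0.05
years = 30

pv_benefits = sum(annual_benefit / (1 + discount_rate) ** t for t in range(1, years + 1))
bc_ratio = pv_benefits / capital_cost
print(f"B/C ratio = {bc_ratio:.2f}")  # a ratio above 1.0 suggests the project may be worthwhile
```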
Although not identified as steps in the UTP process, a great deal of data gathering is involved in the analysis. Census and land use data are obtained, and home interview surveys are conducted. Home interview surveys, land use data, and special trip-attraction surveys provide the information on which the UTP analysis tools are exercised.
Data collection, management, and processing; model estimation; and use of models to yield plans are much-used techniques in the UTP process. In the early days, census data were augmented with data collection methods developed by the Bureau of Public Roads (a predecessor of the Federal Highway Administration): traffic counting procedures, cordon "where are you coming from and where are you going" counts, and home interview techniques. Protocols for coding networks and the notion of analysis or traffic zones emerged at CATS.
Model estimation used existing techniques, and plans were developed using whatever models had been developed in a study. The main difference between current and early practice is the development of analytic resources specific to transportation planning, in addition to the BPR data acquisition techniques used in the early days.
The sequential and aggregate nature of transportation forecasting has come under much criticism. While improvements have been made, in particular giving travel demand an activity basis, much remains to be done. In the 1990s most federal investment in model research went to the TRANSIMS project at Los Alamos National Laboratory, giving physicists a crack at the problem. While the use of supercomputers and detailed simulations may be an improvement on practice, they have yet to be shown to be better (more accurate) than conventional models. The government sold the rights to redistribute TRANSIMS to a national consultancy, KPMG, rather than make it open source.