The need to represent the consequences of congestion mitigation policies in urban transport systems more precisely calls for replacing the static equilibrium assignment with dynamic traffic assignment (DTA) in integrated travel demand and traffic assignment models. Despite the availability of DTA models, and despite the conceptual clarity of how such an integration should take place, only a few operational model systems have been developed for large-scale applications. We report on replacing the static traffic assignment with two different DTA packages in the four-step demand model for the Greater Stockholm region: the macroscopic analytical Visum DUE and the microscopic simulation-based Transmodeler. First results show that, even without systematic calibration, the DTA is in reasonable agreement with observed traffic counts and travel times. The presented experiments did not reveal a striking difference between the macroscopic and the microscopic assignment package. However, given the clear trend towards microscopic modeling and simulation on the travel demand side, the use of a microsimulation-based DTA package appears more natural from a system integration perspective.
Discrete choice models are defined conditional on the analyst's knowledge of the actual choice set. The common practice for many years has been to assume that individual-based choice sets can be generated deterministically on the basis of the choice context and the characteristics of the decision maker. This assumption is invalid or inapplicable in many situations, and probabilistic choice set formation procedures must be considered. The constrained multinomial logit model (CMNL) has recently been proposed as a convenient way to deal with this issue, as it is also suitable for models with a large choice set. In this paper, we analyze how well the implicit choice set generation of the CMNL approximates the explicit choice set generation described in earlier research. The results, based on synthetic data, show that the implicit choice set generation model may be a poor approximation of the explicit model.
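To make the notion of implicit choice set generation concrete: instead of removing infeasible alternatives from the choice set, the CMNL attenuates their utility weights with soft cutoff penalties. The following is a minimal sketch, assuming a single soft upper bound on one attribute; the function name, penalty form, and all numerical values are illustrative, not the specification analyzed in the paper:

```python
import numpy as np

def cmnl_probs(V, attr, threshold, omega=10.0):
    """Logit choice probabilities with a soft cutoff penalty.

    phi is close to 1 while attr is well below the threshold and decays
    smoothly towards 0 above it, mimicking removal from the choice set."""
    phi = 1.0 / (1.0 + np.exp(omega * (attr - threshold)))
    w = np.exp(V) * phi
    return w / w.sum()

V = np.array([1.0, 0.8, 0.5])       # systematic utilities
cost = np.array([2.0, 5.0, 9.0])    # attribute subject to the cutoff
p = cmnl_probs(V, cost, threshold=8.0)
# Alternative 2 exceeds the threshold and receives near-zero probability.
```

Note that the penalty is differentiable, so the model can be estimated with standard maximum-likelihood machinery; a hard (explicit) choice set restriction would instead set the offending probability exactly to zero.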
We describe a new approach to the sampling of route choice sets using the Metropolis-Hastings algorithm.
Currently, most intersection models embedded in macroscopic Dynamic Network Loading (DNL) models are not well suited for urban and regional applications. This is because so-called internal supply constraints, which bound flows due to crossing and merging conflicts inherent to the intersection itself, are missing. This paper discusses the problems that arise upon introducing such constraints. A general framework for the distribution of (internal) supply is adopted, based on the definition of priority parameters that describe the strength of each flow in the competition for a particular supply. Using this representation, it is shown that intersection models - with realistic behavioral assumptions, and in simple configurations - can produce non-unique flow patterns under identical boundary conditions. This solution non-uniqueness is thoroughly discussed, and approaches to dealing with it are provided. It is also revealed that the undesirable model properties are not resolved - but rather amplified - when moving from a point-like to a spatial modeling approach.
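To illustrate what a priority-based supply distribution looks like in the simplest case, consider two flows merging into a single downstream supply. The sketch below uses a Daganzo-style merge rule, which is one common instance of such a scheme; the function and all values are illustrative, not the paper's general framework:

```python
def merge_flows(d1, d2, s, p1=0.5):
    """Distribute downstream supply s among two merging demands d1, d2.

    p1 and p2 = 1 - p1 are the priority parameters describing the
    strength of each flow in the competition for the supply."""
    if d1 + d2 <= s:                # supply suffices: all demand passes
        return float(d1), float(d2)
    # Congested case: each flow gets its priority share, capped by its
    # own demand and by what the other flow leaves over (median rule).
    q1 = sorted([d1, p1 * s, s - d2])[1]
    return q1, s - q1
```

For example, two demands of 1000 veh/h competing for a supply of 1500 veh/h under equal priorities each obtain 750 veh/h, whereas a demand below its priority share passes in full and the remainder accrues to the competing flow.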
In this paper, we explore the idea of dimensionality reduction and approximation of OD demand based on principal component analysis (PCA). First, we show how PCA can be applied to linearly transform high-dimensional OD matrices into a lower-dimensional space without significant loss of accuracy. Next, we define a new transformed set of variables (demand principal components) that represents the fixed structure of the OD matrices in the lower-dimensional space. These new variables are updated online from traffic counts in a novel reduced state space model for the real-time estimation of OD demand. Through an example, we demonstrate the quality improvement of OD estimates obtained with this new formulation and a so-called 'colored' Kalman filter over the standard Kalman filter approach for OD estimation, when the correlated measurement noise induced by the reduction of the state vector is accounted for.
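The dimensionality reduction step can be sketched with synthetic data: historical OD matrices are flattened into vectors, and an SVD of the centered data yields the principal components onto which the demand is projected. All dimensions, the number of retained components, and the noise level below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical history: 200 observed OD matrices of a 10x10 network,
# flattened to 100-dimensional vectors; the true variability is
# generated from only 3 latent factors plus small noise.
basis = rng.random((3, 100))
X = rng.random((200, 3)) @ basis + 0.01 * rng.standard_normal((200, 100))

mean = X.mean(axis=0)
U, svals, Vt = np.linalg.svd(X - mean, full_matrices=False)
k = 3                                  # number of principal components kept
z = (X - mean) @ Vt[:k].T              # demand in the reduced space
X_hat = z @ Vt[:k] + mean              # reconstruction from k components

rel_err = np.linalg.norm(X - X_hat) / np.linalg.norm(X)
```

The 100-dimensional demand is thus tracked through a 3-dimensional state vector `z` with little reconstruction error, which is what makes the reduced state space filter computationally attractive.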
Microsimulation of urban systems evolution requires a synthetic population as a key input. Currently, the focus is on treating synthesis as a fitting problem, and various techniques have been developed for this purpose, including Iterative Proportional Fitting (IPF) and combinatorial optimization based techniques. The key shortcomings of these procedures include: (a) the fitting of a single contingency table, while there may be other solutions matching the available data; (b) the loss of heterogeneity that may not have been captured in the microdata, due to cloning rather than true synthesis of the population; (c) over-reliance on the accuracy of the data used to determine the cloning weights; and (d) poor scalability with respect to the number of attributes of the synthesized agents. To overcome these shortcomings, we propose a Markov Chain Monte Carlo (MCMC) simulation based approach. Partial views of the joint distribution of the agents' attributes, available from various data sources, can be used to simulate draws from the original distribution. The real population from the Swiss census is used to compare the performance of simulation-based synthesis with standard IPF. The standardized root mean square error (SRMSE) statistics indicate that even the worst-case simulation-based synthesis (SRMSE = 0.35) outperformed the best-case IPF synthesis (SRMSE = 0.64). We also used this methodology to generate a synthetic population for Brussels, Belgium, where data availability was highly limited.
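The core simulation idea can be sketched with a toy Gibbs sampler, one standard MCMC scheme: if different data sources supply conditional views of the joint attribute distribution, alternately resampling each attribute from its conditional yields draws from a joint distribution consistent with those views. The two binary attributes and all probabilities below are hypothetical:

```python
import random

# Hypothetical partial views from two data sources: the conditionals
# P(a = 1 | b) and P(b = 1 | a) for two binary person attributes
# (e.g. employment status and car ownership).
p_a_given_b = {0: 0.3, 1: 0.6}
p_b_given_a = {0: 0.2, 1: 0.7}

random.seed(1)
a, b, draws = 0, 0, []
for it in range(20000):
    a = 1 if random.random() < p_a_given_b[b] else 0   # resample a | b
    b = 1 if random.random() < p_b_given_a[a] else 0   # resample b | a
    if it >= 1000:                                     # discard burn-in
        draws.append((a, b))                           # one synthetic agent

freq_a = sum(d[0] for d in draws) / len(draws)         # approx. P(a = 1)
```

Because agents are drawn rather than cloned from microdata records, the synthesized population can contain attribute combinations that never appear in any single source, which is precisely the heterogeneity argument made above.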
The estimation and validation of pedestrian behavioral models requires large amounts of detailed and appropriate data, the collection of which is a costly and time-consuming undertaking. The identification and design of an appropriate data collection method is therefore of great importance, but is itself an arduous and time-consuming task. This article describes a software laboratory that facilitates the design of pedestrian data collection campaigns.
This work contributes to the rapid approximation of solutions to optimization problems that are constrained by iteratively solved transport simulations. Given an objective function, a set of candidate decision variables and a black-box transport simulation that is solved by iteratively attaining a (deterministic or stochastic) equilibrium, the proposed method approximates the best decision variable out of the candidate set without having to run the transport simulation to convergence for every single candidate decision variable. This method can be inserted into a broad class of optimization algorithms or search heuristics that implement the following logic: (i) Create variations of a given, currently best decision variable, (ii) identify one out of these variations as the new currently best decision variable, and (iii) iterate steps (i) and (ii) until no further improvement can be attained. A probabilistic and an asymptotic performance bound are established and exploited in the formulation of an efficient heuristic that is tailored towards tight computational budgets. The efficiency of the method is substantiated through a comprehensive simulation study with a non-trivial road pricing problem. The method is compatible with a broad range of simulators and requires minimal parametrization.
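The logic of steps (i)-(iii) can be sketched as a generic improvement loop. The objective and neighborhood below are toy stand-ins; in the setting of the paper, each objective evaluation would require (approximately) equilibrating an expensive transport simulation, which is exactly what the proposed approximation avoids:

```python
def improvement_loop(objective, x0, neighbors):
    """(i) create variations of the incumbent decision variable,
    (ii) adopt the best improving variation,
    (iii) repeat until no variation improves the objective."""
    best, best_val = x0, objective(x0)
    improved = True
    while improved:
        improved = False
        for cand in neighbors(best):        # step (i): variations
            val = objective(cand)
            if val < best_val:              # step (ii): select better one
                best, best_val = cand, val
                improved = True
    return best, best_val

# Toy stand-in: choose a toll level in {0, ..., 10}; the "simulation"
# is a simple quadratic with its optimum at 7.
obj = lambda x: (x - 7) ** 2
nbrs = lambda x: [v for v in (x - 1, x + 1) if 0 <= v <= 10]
x_star, v_star = improvement_loop(obj, 0, nbrs)
```

The method of the paper plugs into this template by replacing the exact evaluation of `objective` with an approximation obtained from incompletely converged simulation runs.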
The calibration of microsimulation-based demand models is a difficult problem, and there is no source of information that may be disregarded in this process. This work shows how traffic counts can be consistently incorporated in the calibration. The approach is mathematically sound, computationally efficient, and compatible with typical microsimulation techniques.
This article reports on the realization and first applications of the Cadyts (“Calibration of dynamic traffic simulations”) calibration tool. The presented first version of Cadyts calibrates disaggregate demand models of dynamic traffic assignment simulators from traffic counts. The tool is broadly applicable in that it (i) makes only very mild assumptions about the calibrated simulator’s workings and (ii) allows for various modes of technical interaction with the simulation software. The article provides both a conceptual and a technical overview of the tool and demonstrates its applicability to two different traffic microsimulators by example.
This chapter describes a computationally efficient technique for the network loading of microscopically represented travel demand with macroscopic supply models. Beyond its experimentally demonstrated computational performance, the following features of the approach are worth noting: (i) the macroscopic flow model is coupled to the microscopic demand model through a filtering mechanism that effectively removes most vehicle discretization noise; (ii) analytical features such as continuity or differentiability of the macroscopic traffic flow model are preserved despite the microscopic traveler representation.
We consider the previously unsolved problem of sampling paths according to a given distribution from a general network. The problem is difficult because of the combinatorial number of alternatives, which prohibits a complete enumeration of all paths and hence also prevents computing the normalizing constant of the sampling distribution. The problem is important because the ability to sample from a known distribution introduces mathematical rigor into many applications, including the estimation of choice models with sampling of alternatives that can be formalized as paths in a decision network (most obviously route choice), probabilistic map matching, dynamic traffic assignment, and route guidance.
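The core idea, sampling from a distribution known only up to its normalizing constant, can be illustrated with a generic Metropolis-Hastings chain on a small discrete state space. The weights below are arbitrary illustrative values; path sampling on a real network additionally requires the specialized proposal mechanism developed for that setting:

```python
import random

# Unnormalized target weights over five states; the normalizing
# constant (their sum) is never used, which is the point of
# Metropolis-Hastings.
weight = [1.0, 2.0, 4.0, 2.0, 1.0]

random.seed(0)
state = 0
counts = [0] * 5
for it in range(50000):
    prop = random.randrange(5)          # symmetric uniform proposal
    # Accept with probability min(1, w(prop) / w(state)): only the
    # RATIO of weights enters, so normalization is unnecessary.
    if random.random() < min(1.0, weight[prop] / weight[state]):
        state = prop
    counts[state] += 1

freq = [c / 50000 for c in counts]      # empirical visit frequencies
```

In the long run the visit frequencies approach the normalized weights (here 0.1, 0.2, 0.4, 0.2, 0.1); the difficulty addressed by the paper is designing a proposal over paths in a network for which the acceptance ratio remains tractable.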