This paper develops a passenger-based priority strategy for transit buses by balancing the benefits to major streets against the delays imposed on side streets. A rule-based transit signal priority (TSP) scheme assigns priority to schedule-based transit vehicles based on their schedule adherence, passenger occupancy, and the number of passengers waiting at downstream stops. The minimum number of bus passengers required to receive priority is derived using deterministic queueing theory for two cases: green extension and red truncation. VISSIM simulation software is used to evaluate the developed TSP approach against the existing, unconditional, and no-TSP scenarios, assessing performance measures for the major street, the crossing streets, and the network as a whole. The simulation demonstrated that passenger-based TSP significantly decreases travel time and side-street delay on the corridor compared with the existing priority measure, under which buses receive priority once every three minutes.
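As an illustrative sketch of such a rule (a hypothetical simplification, not the paper's exact queueing derivation; the function names and the person-delay threshold are our own), a bus would qualify for priority only when it is behind schedule and the passengers it serves outweigh the extra person-delay imposed on the side street:

```python
import math

def min_bus_passengers(extra_side_delay_person_s, bus_time_saved_s):
    """Smallest passenger count for which the person-delay saved on the bus
    exceeds the extra person-delay caused on the side street.
    (Illustrative threshold, not the paper's queueing-theory formula.)"""
    return math.ceil(extra_side_delay_person_s / bus_time_saved_s)

def grant_priority(schedule_deviation_s, on_board, waiting_downstream,
                   extra_side_delay_person_s, bus_time_saved_s):
    """Rule-based check: a late bus with enough passengers gets priority."""
    if schedule_deviation_s <= 0:  # on time or early: no priority needed
        return False
    threshold = min_bus_passengers(extra_side_delay_person_s, bus_time_saved_s)
    return on_board + waiting_downstream >= threshold
```

For example, if a green extension would cost the side street 300 person-seconds and save the bus 20 seconds, at least 15 passengers must benefit before priority is granted.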
Ridesharing has become a popular model in recent years, enabling users to share their rides with others. In this paper, we introduce a novel ridesharing service, the Activity-aware Ridesharing Group Trip Planning (ARGTP) query in road networks, which exhibits three novel features: (i) it ensures a complete trip for visiting more than two locations, (ii) it allows visiting both fixed and flexible locations, and (iii) it provides a true ridesharing service, instead of a taxi-like ridesourcing service, by matching a group of riders' flexible trips with a driver's fixed trip. A trip visits a point of interest (POI) such as a bank, restaurant, or supermarket for an activity between the source and destination locations. In a fixed trip, a driver visits a predetermined POI (e.g., a specific branch of a bank), whereas in a flexible trip, a rider is happy to visit any POI (e.g., any branch of a bank). An ARGTP query considers the spatial proximity of the riders' trips to a driver's trip and returns an optimal ridesharing group that minimizes the group cost. We develop the first solution to process ARGTP queries in real time. The efficiency of ARGTP query processing depends on the number of candidate riders and the number of POIs to be explored. We introduce novel pruning techniques to prune the riders and refine the POI search space. We perform extensive experiments using both real and synthetic datasets to validate the efficiency and effectiveness of our approach, and show that it outperforms a baseline approach by a large margin.
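A minimal sketch of distance-based rider pruning, assuming Euclidean distance as a lower bound on road-network distance (the `prune_riders` interface and the `max_detour` threshold are hypothetical and do not reflect the paper's actual pruning techniques):

```python
import math

def prune_riders(driver_src, driver_dst, riders, max_detour):
    """Discard riders whose pickup and drop-off would lengthen the driver's
    direct route by more than `max_detour`. Euclidean distances lower-bound
    road-network distances, so no feasible rider is ever pruned."""
    direct = math.dist(driver_src, driver_dst)
    keep = []
    for rider in riders:
        via = (math.dist(driver_src, rider['src'])
               + math.dist(rider['src'], rider['dst'])
               + math.dist(rider['dst'], driver_dst))
        if via - direct <= max_detour:
            keep.append(rider)
    return keep
```

Because the bound is a lower bound, pruning is safe: any rider it removes would be infeasible under the true network distances as well.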
At present, there is no mechanism for Air Navigation Service Providers to probe new flight plans filed by Airline Operations Centers against the existing approved flight plans to see whether they are likely to cause conflicts or push sector traffic densities beyond control. In current Air Traffic Control operations, aircraft conflicts and sector traffic densities are resolved tactically, increasing workload and leading to potential safety risks and loss of capacity and efficiency. We propose a novel data-driven framework to address the long-range aircraft conflict detection and resolution (CDR) problem. Given a set of predicted trajectories, the framework declares a conflict when the protected zone of an aircraft on its trajectory is infringed upon by another aircraft. The framework resolves the conflict by prescribing an alternative solution, optimized by perturbing at least one of the trajectories involved in the conflict. To achieve this, the framework learns from descriptive patterns of historical trajectories and pertinent weather observations and builds a Hidden Markov Model. Using a variant of the Viterbi algorithm, the framework avoids the airspace volume in which the conflict is detected and generates a new optimal trajectory. The key assumption on which the framework is built is that the airspace is a horizontally and vertically concatenated set of spatio-temporal data cubes, where each cube is treated as an atomic unit. We evaluate our framework using real trajectory datasets with pertinent weather observations from two continents and demonstrate its effectiveness.
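To illustrate how a Viterbi-style search can be steered around a conflicting airspace volume, here is a minimal sketch over a discrete HMM in which blocked states (e.g. the data cube where the conflict was detected) are assigned zero probability; the interface is hypothetical and greatly simplified relative to the paper's framework:

```python
def viterbi(states, start_p, trans_p, emit_p, observations, blocked=frozenset()):
    """Most likely state sequence for a discrete HMM. States in `blocked`
    get zero probability, so the decoded path routes around them."""
    V = [{}]
    path = {}
    for s in states:
        V[0][s] = 0.0 if s in blocked else start_p[s] * emit_p[s][observations[0]]
        path[s] = [s]
    for t in range(1, len(observations)):
        V.append({})
        new_path = {}
        for s in states:
            if s in blocked:
                V[t][s] = 0.0
                new_path[s] = []
                continue
            prob, prev = max(
                (V[t - 1][p0] * trans_p[p0][s] * emit_p[s][observations[t]], p0)
                for p0 in states)
            V[t][s] = prob
            new_path[s] = path[prev] + [s]
        path = new_path
    prob, last = max((V[-1][s], s) for s in states)
    return prob, path[last]
```

With an empty `blocked` set this is the textbook Viterbi decoder; adding a state to `blocked` forces the maximum-likelihood path to detour, mirroring how the framework avoids the conflicting cube.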
We present LOOM (Line-Ordering Optimized Maps), a fully automatic generator of geographically accurate transit maps. The input to LOOM is data about the lines of a given transit network, namely, for each line, the sequence of stations it serves and the geographical course the vehicles of this line take. We parse this data from GTFS, the prevailing standard for public transit data. LOOM proceeds in four stages: (1) construct a so-called line graph, where edges correspond to segments of the network with the same set of lines following the same course; (2) compute an optimal partial ordering of the lines by a process we call graph untangling; (3) construct an ILP that yields a line ordering for each edge that minimizes the total number of line crossings and line separations; (4) based on the line graph and the computed line ordering, draw the map. Since our maps respect the geography of the transit network, they can be used as overlays in typical map services. Previous work either did not take the geographical course of the lines into account or was concerned with schematic maps without optimizing line crossings or separations. We evaluate LOOM on six real-world transit networks, with line-ordering search-space sizes up to 2*10^267. Using graph untangling and a sophisticated ILP formulation, we can compute optimal line orderings in a fraction of a second for all networks. This enables real-time updates and interactive use in map editors. We also compare our automatically computed maps to manually designed maps.
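A toy illustration of the line-crossing objective: counting crossings between the line orderings on two adjacent edges, plus a brute-force stand-in for the optimization (illustrative only; with search spaces up to 2*10^267, brute force is hopeless, which is why LOOM uses graph untangling and an ILP):

```python
from itertools import combinations, permutations

def crossings(order_a, order_b):
    """Number of line pairs whose relative order differs between two
    adjacent edges carrying the same set of lines: each such inversion
    forces the lines to cross between the edges."""
    pos_b = {line: i for i, line in enumerate(order_b)}
    return sum(1 for x, y in combinations(order_a, 2) if pos_b[x] > pos_b[y])

def best_order(lines, neighbour_orders):
    """Brute-force stand-in for the ILP: the ordering of `lines` that
    minimises total crossings with the fixed orders on neighbouring edges."""
    return min(permutations(lines),
               key=lambda order: sum(crossings(order, nb)
                                     for nb in neighbour_orders))
```

For instance, two edges carrying lines r, g, b in opposite orders incur 3 crossings (every pair is inverted), and `best_order` recovers the neighbour's order when that alone eliminates all crossings.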
An important problem in terrain analysis is modeling how water flows across a terrain and creates floods by filling up depressions. In this paper we study a number of flood-risk related problems: given a terrain Σ with n vertices, a rain distribution R, and a volume of rain ψ, determine which portions of Σ are flooded. We develop efficient algorithms for flood-risk analysis under the multiflow-direction (MFD) model, in which water at a point can flow along multiple downslope edges and which more accurately represents flooding events. We present three main results. First, we present an O(n log n)-time algorithm to answer a terrain-flood query: if it rains a volume ψ according to a rain distribution R, determine which regions of Σ will be flooded. Second, we present an algorithm for preprocessing a terrain Σ containing m sinks into a data structure of size O(nm) for answering point-flood queries: given a volume ψ, a distribution R, and a point q ∈ Σ, determine whether q will be flooded. A point-flood query can be answered in O(|R|k + k^2) time, where k is the number of maximal depressions in Σ containing q and |R| is the number of vertices in R with positive rainfall. Finally, we present algorithms for answering a flood-time query: given a rain distribution R and a point q ∈ Σ, determine the volume of rain that must fall before q is flooded. We implemented our terrain-flooding algorithm and tested its efficacy and efficiency on real terrains.
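For intuition, here is one common way an MFD rule can split water among downslope neighbours, in proportion to the elevation drop along each edge; this is an illustrative variant, not necessarily the exact weighting used in the paper:

```python
def mfd_split(elev_u, neighbour_elevs):
    """Multi-flow-direction split at a vertex with elevation `elev_u`:
    water is divided among strictly downslope neighbours in proportion
    to the elevation drop along each edge. Returns a dict of fractions
    summing to 1 (empty if the vertex is a local minimum)."""
    drops = {v: elev_u - e for v, e in neighbour_elevs.items() if e < elev_u}
    total = sum(drops.values())
    return {v: d / total for v, d in drops.items()}
```

This is what distinguishes MFD from single-flow-direction models, which would send all the water to the single steepest neighbour.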
Location data from mobile devices is a sensitive yet valuable commodity for location-based services and advertising. We investigate the intrinsic value of location data in the context of strong privacy, where location information is only available from end users via purchase. We present an algorithm to compute the expected value of location data from a user, without access to the specific coordinates of the location data point. We use decision-theoretic techniques to provide a principled way for a potential buyer to make purchasing decisions about private user location data. We illustrate our approach in three scenarios: the delivery of targeted ads specific to a user's home location, the estimation of traffic speed, and location prediction. In all three cases, the methodology leads to quantifiably better purchasing decisions than competing methods.
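Such a purchasing decision can be framed as a standard value-of-information computation; the sketch below uses a hypothetical interface (the paper's actual model and parameters may differ): buy the location data only if the expected gain from acting on the revealed location exceeds its price.

```python
def expected_value_of_location(prior, utility_with, utility_without):
    """Expected value of learning the user's location before acting.
    `prior` maps each candidate location to its probability;
    `utility_with[l]` is the payoff of the best action once the location
    is known to be l; `utility_without` is the payoff of the best single
    action chosen under the prior alone. (Standard VOI; hypothetical names.)"""
    ev_informed = sum(p * utility_with[l] for l, p in prior.items())
    return ev_informed - utility_without

def should_buy(prior, utility_with, utility_without, price):
    """Buy the data point only if its expected value exceeds its price."""
    return expected_value_of_location(prior, utility_with, utility_without) > price
```

Note that the buyer never needs the coordinates themselves: the decision is made from the prior distribution and the payoffs, matching the paper's setting of purchasing unseen data.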
Many real-world applications require identifying crowd movements, such as finding movement corridors in cities and the most popular paths. When user trajectories are unavailable and only sporadic location data exists, such as location-based social network data, extracting movement-related information becomes difficult. Rather than processing all points in a data set for each query, a better approach is to construct a graph based on user locations and query this graph to answer questions such as shortest paths, most popular paths, and movement corridors. A popular choice is the shortest path graph. However, the shortest path graph can be inefficient and ineffective for analysing movement data, since it computes graph edges from shortest paths over all points in the data set; because of this global view of the data, the resulting edge sets are usually very restrictive and unsuitable for movement analysis. We propose the stepping stone graph, which computes edges from point pairs rather than all points and thus focuses on possible local movements, making it both efficient and effective for location-based social network data. We demonstrate its uses by applying it in the location-based social network domain and comparing it with the shortest path graph. We also compare its properties with a range of other graphs and show how the stepping stone graph relates to the Gabriel graph, the relative neighbourhood graph, and the Delaunay triangulation.
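For reference, the Gabriel graph mentioned above admits a simple naive construction: points p and q are connected iff no third point lies inside the closed disc whose diameter is the segment pq (a sketch of that classical definition, not of the stepping stone graph itself):

```python
def gabriel_edges(points):
    """Gabriel graph over 2D points: (i, j) is an edge iff no other point
    lies inside the closed disc with diameter points[i]-points[j].
    Naive O(n^3) check, for illustration only."""
    edges = []
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            (px, py), (qx, qy) = points[i], points[j]
            cx, cy = (px + qx) / 2, (py + qy) / 2          # disc centre
            r2 = ((px - qx) ** 2 + (py - qy) ** 2) / 4     # squared radius
            if all((x - cx) ** 2 + (y - cy) ** 2 > r2
                   for k, (x, y) in enumerate(points) if k not in (i, j)):
                edges.append((i, j))
    return edges
```

Like the stepping stone graph, this test is local: each edge depends only on a pairwise region, not on shortest paths through the whole data set, which is the contrast the paper draws with the shortest path graph.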