Abstract:
The success of modern ride-sharing platforms crucially depends on the profit of the ride-sharing fleet operating companies and on how efficiently resources are managed. Further, ride-sharing allows costs to be shared and, hence, reduces congestion and emissions by making better use of vehicle capacities. In this work, we develop a distributed, model-free framework, DeepPool, that uses deep Q-network (DQN) techniques to learn optimal dispatch policies by interacting with the environment.
Further, DeepPool efficiently incorporates travel demand statistics and deep learning models to manage vehicle dispatching for improved ride-sharing services. Using a real-world dataset of taxi trip records in New York City, we show that DeepPool outperforms other strategies proposed in the literature that do not consider ride-sharing or do not dispatch vehicles to regions where future demand is anticipated. Finally, DeepPool adapts rapidly to dynamic environments since it is implemented in a distributed manner, with each vehicle solving its own DQN individually without coordination.
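
To make the distributed, per-vehicle setup concrete, the sketch below shows a minimal DQN agent in which a single vehicle maps a local state (e.g., its current zone and anticipated demand features) to a dispatch action over candidate zones, with no coordination across vehicles. The state and action dimensions, network size, and hyperparameters are illustrative assumptions and are not taken from the DeepPool specification.

```python
# Illustrative sketch only: one vehicle's independent DQN dispatch agent.
# State/action sizes, network shape, and hyperparameters are assumed.
import random
from collections import deque

import torch
import torch.nn as nn
import torch.optim as optim


class QNetwork(nn.Module):
    """Maps a vehicle's local state to Q-values over candidate dispatch zones."""

    def __init__(self, state_dim: int, num_zones: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(state_dim, 64),
            nn.ReLU(),
            nn.Linear(64, num_zones),
        )

    def forward(self, state: torch.Tensor) -> torch.Tensor:
        return self.net(state)


class VehicleAgent:
    """A single vehicle's DQN; it learns without coordinating with other vehicles."""

    def __init__(self, state_dim: int, num_zones: int,
                 gamma: float = 0.99, epsilon: float = 0.1):
        self.q_net = QNetwork(state_dim, num_zones)
        self.optimizer = optim.Adam(self.q_net.parameters(), lr=1e-3)
        self.replay = deque(maxlen=10_000)
        self.gamma = gamma
        self.epsilon = epsilon
        self.num_zones = num_zones

    def act(self, state: torch.Tensor) -> int:
        # Epsilon-greedy choice of the zone to dispatch to.
        if random.random() < self.epsilon:
            return random.randrange(self.num_zones)
        with torch.no_grad():
            return int(self.q_net(state).argmax().item())

    def store(self, s, a, r, s_next):
        self.replay.append((s, a, r, s_next))

    def update(self, batch_size: int = 32):
        # One gradient step toward the one-step TD target.
        if len(self.replay) < batch_size:
            return
        batch = random.sample(self.replay, batch_size)
        s, a, r, s_next = map(torch.stack, zip(*[
            (b[0], torch.tensor(b[1]), torch.tensor(b[2], dtype=torch.float32), b[3])
            for b in batch
        ]))
        q_sa = self.q_net(s).gather(1, a.unsqueeze(1)).squeeze(1)
        with torch.no_grad():
            target = r + self.gamma * self.q_net(s_next).max(dim=1).values
        loss = nn.functional.mse_loss(q_sa, target)
        self.optimizer.zero_grad()
        loss.backward()
        self.optimizer.step()
```

In this hypothetical setup, each vehicle would instantiate its own VehicleAgent, observe a reward reflecting ride-sharing objectives (e.g., served demand minus idle travel), and update only its own network, which mirrors the coordination-free distributed design described above.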