Published July 2019
Book Section - Chapter

View-Adaptive Weighted Deep Transfer Learning for Distributed Time-Series Classification


In this paper, we propose an effective multi-view deep transfer learning framework for multivariate time-series data. Though widely used for tasks such as computer vision, transfer learning remains underexplored for time-series classification problems (e.g., classification of light curves). The proposed framework makes several important contributions that facilitate knowledge sharing while ensuring an effective solution for domain-specific, fine-level categorization. First, in contrast to traditional approaches, the framework describes pairwise view similarity by identifying a smaller subset of source-view samples that closely resemble the target data patterns. Second, through two-phase learning, a generic baseline model is learned on a larger source data collection and later fine-tuned on a smaller target data collection, precisely approximating the target data patterns. Third, a view-adaptive timestamp weighting scheme evaluates the relative importance of each timestamp in a data-driven manner, enabling a flexible yet discriminative feature representation in the presence of evolving data characteristics. Experiments show that, compared to existing approaches, the proposed deep transfer learning framework improves classification performance by around 2-3% on the UCI multi-view activity recognition dataset, while also demonstrating a robust, generalized representation capacity in classifying several large-scale multi-view light curve collections.
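The pipeline the abstract describes — selecting source-view samples that resemble the target data, pre-training on that subset, then fine-tuning on the smaller target collection — can be sketched in miniature. The sketch below is illustrative only: it substitutes plain logistic regression for the paper's deep model, uses synthetic data, and approximates the view-adaptive timestamp weighting with normalized weight magnitudes. All function and variable names here are hypothetical, not from the paper.

```python
import numpy as np

# Hypothetical illustration of the abstract's three steps; data and names
# are synthetic, and logistic regression stands in for the deep model.
rng = np.random.default_rng(0)
T = 20  # timestamps per (univariate) series

def make_series(n, bump_center):
    """Synthetic view: class 1 carries a Gaussian bump, class 0 is noise."""
    y = (rng.random(n) < 0.5).astype(float)
    X = rng.normal(0.0, 0.3, size=(n, T))
    bump = np.exp(-0.5 * ((np.arange(T) - bump_center) / 2.0) ** 2)
    return X + np.outer(y, 2.0 * bump), y

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-np.clip(z, -30, 30)))

def train_logreg(X, y, w=None, b=0.0, lr=0.3, epochs=400):
    """Gradient-descent logistic regression; pass (w, b) in to fine-tune."""
    n, d = X.shape
    w = np.zeros(d) if w is None else w.copy()
    for _ in range(epochs):
        err = sigmoid(X @ w + b) - y
        w -= lr * (X.T @ err) / n
        b -= lr * err.mean()
    return w, b

def select_similar_source(X_src, y_src, X_tgt, y_tgt, frac=0.5):
    """Keep, per class, the source series nearest the target class centroid."""
    keep = []
    for c in (0.0, 1.0):
        idx = np.where(y_src == c)[0]
        centroid = X_tgt[y_tgt == c].mean(axis=0)
        dist = np.linalg.norm(X_src[idx] - centroid, axis=1)
        keep.extend(idx[np.argsort(dist)[: max(1, int(frac * len(idx)))]])
    keep = np.array(keep)
    return X_src[keep], y_src[keep]

# Source view (bump near t=10) vs. shifted target view (bump near t=12).
X_src, y_src = make_series(400, bump_center=10)
X_tgt, y_tgt = make_series(60, bump_center=12)
X_test, y_test = make_series(200, bump_center=12)

# 1) similarity-based subset selection, 2) pre-train, 3) fine-tune.
X_sel, y_sel = select_similar_source(X_src, y_src, X_tgt, y_tgt)
w, b = train_logreg(X_sel, y_sel)                       # phase 1: source
w, b = train_logreg(X_tgt, y_tgt, w=w, b=b,
                    lr=0.1, epochs=100)                 # phase 2: target

# Crude proxy for timestamp importance: normalized weight magnitudes.
alpha = np.abs(w) / np.abs(w).sum()
acc = np.mean((sigmoid(X_test @ w + b) > 0.5) == y_test)
```

Even in this toy setting, the two-phase scheme mirrors the abstract's intent: the source-trained weights give the target phase a strong initialization, and the per-timestamp magnitudes `alpha` concentrate around the discriminative region of the series.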

Additional Information

© 2019 IEEE. Funding for this research was provided by the National Science Foundation (NSF) Data Infrastructure Building Blocks (DIBBs) Program under award #1640818.
