Abstract:
Automatically generating a financial report from a piece of breaking macro news is a challenging task. Essentially, it is a text-to-text generation problem, but one that must produce long text, i.e., more than 40 words, from a short piece of macro news. Moreover, the core skill human analysts rely on when writing financial reports is logical inference over the succinct macro news. To address this issue, we propose a novel multiple-edits neural network that first learns an outline for the given news and then generates the financial report from the learned outline. Specifically, the input news is first embedded via a skip-gram model and then fed into a Bi-LSTM component to obtain a contextual representation vector. This vector is used to learn the latent word probability distribution for report generation. To train this end-to-end neural network, we collected one hundred thousand news-report pairs. Extensive experiments are performed on this dataset. The proposed model achieves state-of-the-art performance against baseline models with respect to BLEU, ROUGE, and human evaluation scores. Although the readability of the reports generated by our approach is better than that of the other models, it remains an open problem that calls for further effort in the future.
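As a minimal illustration of the encoding pipeline sketched in the abstract (skip-gram embeddings fed into a Bi-LSTM whose contextual representation parameterizes a word distribution), the following PyTorch sketch shows one plausible realization; all layer sizes, names, and the pooling choice are assumptions for exposition, not the authors' actual implementation.

```python
import torch
import torch.nn as nn

class NewsEncoder(nn.Module):
    """Illustrative sketch: embed a news sentence with (pretrained) skip-gram
    vectors, encode it with a Bi-LSTM, and map the contextual representation
    to a probability distribution over the report vocabulary.
    Sizes and names are assumptions, not the paper's configuration."""

    def __init__(self, vocab_size, embed_dim=300, hidden_dim=256):
        super().__init__()
        # Embedding layer; weights would normally be initialized from
        # skip-gram (word2vec) vectors trained on the news corpus.
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.bilstm = nn.LSTM(embed_dim, hidden_dim,
                              batch_first=True, bidirectional=True)
        # Projects the pooled contextual vector onto the output vocabulary.
        self.out = nn.Linear(2 * hidden_dim, vocab_size)

    def forward(self, token_ids):
        emb = self.embed(token_ids)      # (batch, seq_len, embed_dim)
        states, _ = self.bilstm(emb)     # (batch, seq_len, 2 * hidden_dim)
        context = states.mean(dim=1)     # pooled contextual representation
        # Latent word probability distribution conditioned on the news.
        return torch.softmax(self.out(context), dim=-1)
```

In the full model described in the paper, this distribution would drive outline learning and report generation rather than a single pooled prediction; the sketch only makes the encoder-to-distribution step concrete.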