Commit 3af34e3 · 1 parent: b5740b7 · Update README.md
README.md CHANGED
@@ -11,15 +11,14 @@ widget:
 
 ---
 
-###
+### PEGASUS for Financial Summarization
 
-This model was
-
-PEGASUS model was originally proposed by Jingqing Zhang, Yao Zhao, Mohammad Saleh and Peter J. Liu in [PEGASUS: Pre-training with Extracted Gap-sentences for Abstractive Summarization](https://arxiv.org/pdf/1912.08777.pdf).
+This model was fine-tuned on a novel financial news dataset, which consists of 2K articles from [Bloomberg](https://www.bloomberg.com/europe), on topics such as stock, markets, currencies, rate and cryptocurrencies.
 
+It is based on the [PEGASUS](https://huggingface.co/transformers/model_doc/pegasus.html) model and in particular PEGASUS fine-tuned on the Extreme Summarization (XSum) dataset: [google/pegasus-xsum model](https://huggingface.co/google/pegasus-xsum). PEGASUS was originally proposed by Jingqing Zhang, Yao Zhao, Mohammad Saleh and Peter J. Liu in [PEGASUS: Pre-training with Extracted Gap-sentences for Abstractive Summarization](https://arxiv.org/pdf/1912.08777.pdf).
 
 ### How to use
-We provide a simple snippet of how to use this model for the task of financial summarization in
+We provide a simple snippet of how to use this model for the task of financial summarization in PyTorch.
 
 ```Python
 from transformers import PegasusTokenizer, PegasusForConditionalGeneration, TFPegasusForConditionalGeneration
@@ -80,3 +79,8 @@ BibTeX entry:
 year={2021}
 }
 ```
+
+## Support
+
+Contact us at [[email protected]](mailto:[email protected]) if you are interested in a more sophisticated version of the model, trained on more articles and adapted to your needs!
+
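
The "How to use" snippet is cut off after its import line in this diff. Below is a minimal sketch of how such a PyTorch snippet typically continues with the classes imported above; the model identifier `your-org/financial-summarization-pegasus`, the generation settings, and the sample passage are placeholders and assumptions, not taken from this commit.

```Python
from transformers import PegasusTokenizer, PegasusForConditionalGeneration

# Placeholder Hub id: substitute the actual id of this fine-tuned checkpoint.
model_name = "your-org/financial-summarization-pegasus"

tokenizer = PegasusTokenizer.from_pretrained(model_name)
model = PegasusForConditionalGeneration.from_pretrained(model_name)

# Any financial news passage can go here; this one is an illustrative placeholder.
text = (
    "Shares of the bank rose after it reported record quarterly profit and "
    "raised its full-year guidance on strong trading revenue."
)

# Tokenize, generate an abstractive summary with beam search, and decode it.
input_ids = tokenizer(text, return_tensors="pt").input_ids
summary_ids = model.generate(input_ids, max_length=32, num_beams=5, early_stopping=True)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```

A TensorFlow variant would instead use the `TFPegasusForConditionalGeneration` class from the same import line, with `return_tensors="tf"` when tokenizing.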