---
language:
- en
pipeline_tag: text2text-generation
---

```python
from transformers import T5ForConditionalGeneration
from transformers import T5TokenizerFast as T5Tokenizer

model_path = "svjack/comet-atomic-en"
device = "cpu"
#device = "cuda:0"

tokenizer = T5Tokenizer.from_pretrained(model_path)
model = T5ForConditionalGeneration.from_pretrained(model_path).to(device).eval()

# Question-style prefixes corresponding to the COMET-ATOMIC relations
# (preconditions, effects, intents, reactions).
NEED_PREFIX = 'What are the necessary preconditions for the next event?'
EFFECT_PREFIX = 'What could happen after the next event?'
INTENT_PREFIX = 'What is the motivation for the next event?'
REACT_PREFIX = 'What are your feelings after the following event?'

event = "X had a big meal."
for prefix in [NEED_PREFIX, EFFECT_PREFIX, INTENT_PREFIX, REACT_PREFIX]:
    # Prepend the relation question to the event and generate an answer.
    prompt = "{}{}".format(prefix, event)
    encode = tokenizer(prompt, return_tensors='pt').to(device)
    answer = model.generate(encode.input_ids,
                            max_length=128,
                            num_beams=2,
                            top_p=0.95,
                            top_k=50,
                            repetition_penalty=2.5,
                            length_penalty=1.0,
                            early_stopping=True,
                            )[0]
    decoded = tokenizer.decode(answer, skip_special_tokens=True)
    print(prompt, "\n---Answer:", decoded, "----\n")
```
Expected output:

```
What are the necessary preconditions for the next event?X had a big meal.
---Answer: X goes shopping at the supermarket ----

What could happen after the next event?X had a big meal.
---Answer: X gets fat ----

What is the motivation for the next event?X had a big meal.
---Answer: X wants to eat ----

What are your feelings after the following event?X had a big meal.
---Answer: X tastes good ----
```
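The four prefixes can also be wrapped in a small helper that returns every inference for an arbitrary event at once. This is a minimal sketch, not part of the original snippet: the `generate_inferences` name and its defaults are illustrative, and it reuses the `tokenizer`, `model`, `device`, and prefix constants defined above.

```python
def generate_inferences(event, prefixes=None, max_length=128):
    """Hypothetical convenience wrapper: run the model on each relation prompt
    for the given event and return {prefix: generated answer}."""
    if prefixes is None:
        prefixes = [NEED_PREFIX, EFFECT_PREFIX, INTENT_PREFIX, REACT_PREFIX]
    results = {}
    for prefix in prefixes:
        # Build the question-plus-event prompt, same format as the loop above.
        encode = tokenizer(prefix + event, return_tensors="pt").to(device)
        output_ids = model.generate(
            encode.input_ids,
            max_length=max_length,
            num_beams=2,
            early_stopping=True,
        )[0]
        results[prefix] = tokenizer.decode(output_ids, skip_special_tokens=True)
    return results

# Example usage with a different event:
# generate_inferences("X passed the driving test.")
```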