Update README.md
README.md CHANGED

@@ -20,11 +20,7 @@ Phi-1.5 can write poems, draft emails, create stories, summarize texts, write Py
 
 ## How to Use
 
-Phi-1.5 has been integrated in the `transformers` version 4.37.0
-
-* When loading the model, ensure that `trust_remote_code=True` is passed as an argument of the `from_pretrained()` function.
-
-The current `transformers` version can be verified with: `pip list | grep transformers`.
+Phi-1.5 has been integrated in the `transformers` version 4.37.0, please ensure that you are using a version equal or higher than it.
 
 ## Intended Uses
 
@@ -91,8 +87,6 @@ where the model generates the text after the comments.
 
 * Phi-1.5 has not been tested to ensure that it performs adequately for any production-level application. Please refer to the limitation sections of this document for more details.
 
-* If you are using `transformers<4.37.0`, always load the model with `trust_remote_code=True` to prevent side-effects.
-
 ## Sample Code
 
 ```python
@@ -101,8 +95,8 @@ from transformers import AutoModelForCausalLM, AutoTokenizer
 
 torch.set_default_device("cuda")
 
-model = AutoModelForCausalLM.from_pretrained("microsoft/phi-1_5", torch_dtype="auto", trust_remote_code=True)
-tokenizer = AutoTokenizer.from_pretrained("microsoft/phi-1_5", trust_remote_code=True)
+model = AutoModelForCausalLM.from_pretrained("microsoft/phi-1_5", torch_dtype="auto")
+tokenizer = AutoTokenizer.from_pretrained("microsoft/phi-1_5")
 
 inputs = tokenizer('''def print_prime(n):
    """
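The updated README line requires `transformers` at version 4.37.0 or higher. As a minimal sketch of what that check implies (not part of the model card; `parse_version` and `meets_requirement` are illustrative names, and only plain dotted version strings are handled):

```python
# Sketch: compare dotted version strings numerically, mirroring the
# ">= 4.37.0" requirement stated in the updated README.
def parse_version(v: str) -> tuple:
    # "4.37.0" -> (4, 37, 0); keeps only the first three numeric components
    return tuple(int(part) for part in v.split(".")[:3])

def meets_requirement(installed: str, required: str = "4.37.0") -> bool:
    # Tuple comparison is element-wise, so (4, 36, 2) < (4, 37, 0)
    return parse_version(installed) >= parse_version(required)

print(meets_requirement("4.36.2"))  # False
print(meets_requirement("4.37.0"))  # True
```

The installed version itself can still be read off with the `pip list | grep transformers` command shown on the removed side of the diff.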
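The sample code prompts the model to complete `print_prime(n)` after its docstring. Assuming the elided docstring asks for the primes up to `n`, as the function name suggests, a plain reference implementation of the expected behaviour (a sketch for comparing against model output, not the model card's code) might look like:

```python
def primes_up_to(n: int) -> list:
    # Trial division: c is prime iff it has no divisor in [2, sqrt(c)]
    return [c for c in range(2, n + 1)
            if all(c % d != 0 for d in range(2, int(c ** 0.5) + 1))]

def print_prime(n: int) -> None:
    # Mirrors the prompt's signature; prints one prime per line
    for p in primes_up_to(n):
        print(p)

print_prime(10)  # prints 2, 3, 5, 7 on separate lines
```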