- model — no message
- 1.52 kB — initial commit
- 279 Bytes — initial commit
- 2.88 kB — Increased number of documents to 50000
- 215 Bytes — no message
- 64 Bytes — Updated requirements.txt file
- retriever.pkl
  - Detected Pickle imports (35):
    - "collections.OrderedDict"
    - "tokenizers.models.Model"
    - "tokenizers.AddedToken"
    - "transformers.models.bert.modeling_bert.BertAttention"
    - "transformers.models.bert.configuration_bert.BertConfig"
    - "sentence_transformers.models.Pooling.Pooling"
    - "torch._utils._rebuild_tensor_v2"
    - "torch.nn.modules.linear.Linear"
    - "torch.nn.modules.sparse.Embedding"
    - "transformers.models.bert.tokenization_bert_fast.BertTokenizerFast"
    - "transformers.models.bert.modeling_bert.BertLayer"
    - "torch.storage._load_from_bytes"
    - "model.retriever.Retriever"
    - "rank_bm25.BM25Okapi"
    - "transformers.models.bert.modeling_bert.BertEmbeddings"
    - "torch.torch_version.TorchVersion"
    - "transformers.models.bert.modeling_bert.BertOutput"
    - "transformers.models.bert.modeling_bert.BertModel"
    - "sentence_transformers.models.Transformer.Transformer"
    - "transformers.models.bert.modeling_bert.BertSdpaSelfAttention"
    - "transformers.models.bert.modeling_bert.BertSelfOutput"
    - "tokenizers.Tokenizer"
    - "torch.nn.modules.activation.Tanh"
    - "torch.nn.modules.container.ModuleList"
    - "torch._C._nn.gelu"
    - "sentence_transformers.SentenceTransformer.SentenceTransformer"
    - "transformers.models.bert.modeling_bert.BertEncoder"
    - "sentence_transformers.model_card.SentenceTransformerModelCardData"
    - "transformers.models.bert.modeling_bert.BertIntermediate"
    - "sentence_transformers.models.Normalize.Normalize"
    - "transformers.activations.GELUActivation"
    - "torch._utils._rebuild_parameter"
    - "transformers.models.bert.modeling_bert.BertPooler"
    - "torch.nn.modules.normalization.LayerNorm"
    - "torch.nn.modules.dropout.Dropout"
  - 317 MB — Increased number of documents to 50000
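The import list above is what the Hub's pickle scanner extracts from `retriever.pkl`: every `GLOBAL`/`STACK_GLOBAL` opcode in the pickle stream names a class or function that will be imported (and potentially have code executed) when the file is loaded, which is why unpickling also requires the original `model.retriever.Retriever` module to be importable. A minimal stdlib-only sketch of that scanning idea, using a stand-in `Retriever` class (the class name is borrowed from the import list above; its implementation here is an assumption, not the repository's code):

```python
import pickle
import pickletools

# Stand-in for the repository's pickled object; the real retriever.pkl
# bundles transformers/sentence_transformers/rank_bm25 classes instead.
class Retriever:
    def __init__(self, docs):
        self.docs = docs

def pickled_imports(data: bytes) -> set:
    """Collect the module.name references in a pickle stream, roughly what
    the Hub reports as 'Detected Pickle imports'."""
    imports = set()
    ops = list(pickletools.genops(data))
    for i, (op, arg, _pos) in enumerate(ops):
        if op.name == "GLOBAL":
            # Protocol <= 3: argument is "module name" on one opcode.
            module, name = arg.split(" ", 1)
            imports.add(f"{module}.{name}")
        elif op.name == "STACK_GLOBAL":
            # Protocol >= 4: module and qualname are the two most recent
            # string opcodes pushed onto the stack.
            strs = [a for o, a, _ in ops[:i]
                    if o.name in ("SHORT_BINUNICODE", "BINUNICODE",
                                  "BINUNICODE8", "UNICODE")]
            if len(strs) >= 2:
                imports.add(f"{strs[-2]}.{strs[-1]}")
    return imports

data = pickle.dumps(Retriever(["doc one", "doc two"]))
print(pickled_imports(data))  # prints {'__main__.Retriever'} when run as a script
```

This inspects opcodes without calling `pickle.loads`, so no code from the pickle runs during the scan; a plain-data pickle (e.g. a dict of lists) yields an empty set, while an object pickle names every class needed to rebuild it.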