Upload Salesforce_CoDA-v0-Instruct_0.txt with huggingface_hub
Salesforce_CoDA-v0-Instruct_0.txt
ADDED
@@ -0,0 +1,44 @@
+Traceback (most recent call last):
+  File "/tmp/Salesforce_CoDA-v0-Instruct_04pDbPU.py", line 16, in <module>
+    pipe = pipeline("text-generation", model="Salesforce/CoDA-v0-Instruct", trust_remote_code=True)
+  File "/tmp/.cache/uv/environments-v2/e8b6185b99c36290/lib/python3.13/site-packages/transformers/pipelines/__init__.py", line 1027, in pipeline
+    framework, model = infer_framework_load_model(
+                       ~~~~~~~~~~~~~~~~~~~~~~~~~~^
+        adapter_path if adapter_path is not None else model,
+        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+    ...<5 lines>...
+        **model_kwargs,
+        ^^^^^^^^^^^^^^^
+    )
+    ^
+  File "/tmp/.cache/uv/environments-v2/e8b6185b99c36290/lib/python3.13/site-packages/transformers/pipelines/base.py", line 333, in infer_framework_load_model
+    raise ValueError(
+        f"Could not load model {model} with any of the following classes: {class_tuple}. See the original errors:\n\n{error}\n"
+    )
+ValueError: Could not load model Salesforce/CoDA-v0-Instruct with any of the following classes: (<class 'transformers.models.auto.modeling_auto.AutoModelForCausalLM'>,). See the original errors:
+
+while loading with AutoModelForCausalLM, an error is thrown:
+Traceback (most recent call last):
+  File "/tmp/.cache/uv/environments-v2/e8b6185b99c36290/lib/python3.13/site-packages/transformers/pipelines/base.py", line 293, in infer_framework_load_model
+    model = model_class.from_pretrained(model, **kwargs)
+  File "/tmp/.cache/uv/environments-v2/e8b6185b99c36290/lib/python3.13/site-packages/transformers/models/auto/auto_factory.py", line 607, in from_pretrained
+    raise ValueError(
+    ...<2 lines>...
+    )
+ValueError: Unrecognized configuration class <class 'transformers_modules.Salesforce.CoDA_hyphen_v0_hyphen_Instruct.89d646aff12a3c17079eadba13bd784e74337b41.model_config.CoDAConfig'> for this kind of AutoModel: AutoModelForCausalLM.
+Model type should be one of ApertusConfig, ArceeConfig, AriaTextConfig, BambaConfig, BartConfig, BertConfig, BertGenerationConfig, BigBirdConfig, BigBirdPegasusConfig, BioGptConfig, BitNetConfig, BlenderbotConfig, BlenderbotSmallConfig, BloomConfig, BltConfig, CamembertConfig, LlamaConfig, CodeGenConfig, CohereConfig, Cohere2Config, CpmAntConfig, CTRLConfig, Data2VecTextConfig, DbrxConfig, DeepseekV2Config, DeepseekV3Config, DiffLlamaConfig, DogeConfig, Dots1Config, ElectraConfig, Emu3Config, ErnieConfig, Ernie4_5Config, Ernie4_5_MoeConfig, Exaone4Config, FalconConfig, FalconH1Config, FalconMambaConfig, FlexOlmoConfig, FuyuConfig, GemmaConfig, Gemma2Config, Gemma3Config, Gemma3TextConfig, Gemma3nConfig, Gemma3nTextConfig, GitConfig, GlmConfig, Glm4Config, Glm4MoeConfig, GotOcr2Config, GPT2Config, GPT2Config, GPTBigCodeConfig, GPTNeoConfig, GPTNeoXConfig, GPTNeoXJapaneseConfig, GptOssConfig, GPTJConfig, GraniteConfig, GraniteMoeConfig, GraniteMoeHybridConfig, GraniteMoeSharedConfig, HeliumConfig, HunYuanDenseV1Config, HunYuanMoEV1Config, JambaConfig, JetMoeConfig, Lfm2Config, LlamaConfig, Llama4Config, Llama4TextConfig, LongcatFlashConfig, MambaConfig, Mamba2Config, MarianConfig, MBartConfig, MegaConfig, MegatronBertConfig, MiniMaxConfig, MinistralConfig, MistralConfig, MixtralConfig, MllamaConfig, ModernBertDecoderConfig, MoshiConfig, MptConfig, MusicgenConfig, MusicgenMelodyConfig, MvpConfig, NemotronConfig, OlmoConfig, Olmo2Config, Olmo3Config, OlmoeConfig, OpenLlamaConfig, OpenAIGPTConfig, OPTConfig, PegasusConfig, PersimmonConfig, PhiConfig, Phi3Config, Phi4MultimodalConfig, PhimoeConfig, PLBartConfig, ProphetNetConfig, QDQBertConfig, Qwen2Config, Qwen2MoeConfig, Qwen3Config, Qwen3MoeConfig, Qwen3NextConfig, RecurrentGemmaConfig, ReformerConfig, RemBertConfig, RobertaConfig, RobertaPreLayerNormConfig, RoCBertConfig, RoFormerConfig, RwkvConfig, SeedOssConfig, SmolLM3Config, Speech2Text2Config, StableLmConfig, Starcoder2Config, TransfoXLConfig, TrOCRConfig, VaultGemmaConfig, WhisperConfig, XGLMConfig, XLMConfig, XLMProphetNetConfig, XLMRobertaConfig, XLMRobertaXLConfig, XLNetConfig, xLSTMConfig, XmodConfig, ZambaConfig, Zamba2Config.
+
+During handling of the above exception, another exception occurred:
+
+Traceback (most recent call last):
+  File "/tmp/.cache/uv/environments-v2/e8b6185b99c36290/lib/python3.13/site-packages/transformers/pipelines/base.py", line 311, in infer_framework_load_model
+    model = model_class.from_pretrained(model, **fp32_kwargs)
+  File "/tmp/.cache/uv/environments-v2/e8b6185b99c36290/lib/python3.13/site-packages/transformers/models/auto/auto_factory.py", line 607, in from_pretrained
+    raise ValueError(
+    ...<2 lines>...
+    )
+ValueError: Unrecognized configuration class <class 'transformers_modules.Salesforce.CoDA_hyphen_v0_hyphen_Instruct.89d646aff12a3c17079eadba13bd784e74337b41.model_config.CoDAConfig'> for this kind of AutoModel: AutoModelForCausalLM.
+Model type should be one of ApertusConfig, ArceeConfig, AriaTextConfig, BambaConfig, BartConfig, BertConfig, BertGenerationConfig, BigBirdConfig, BigBirdPegasusConfig, BioGptConfig, BitNetConfig, BlenderbotConfig, BlenderbotSmallConfig, BloomConfig, BltConfig, CamembertConfig, LlamaConfig, CodeGenConfig, CohereConfig, Cohere2Config, CpmAntConfig, CTRLConfig, Data2VecTextConfig, DbrxConfig, DeepseekV2Config, DeepseekV3Config, DiffLlamaConfig, DogeConfig, Dots1Config, ElectraConfig, Emu3Config, ErnieConfig, Ernie4_5Config, Ernie4_5_MoeConfig, Exaone4Config, FalconConfig, FalconH1Config, FalconMambaConfig, FlexOlmoConfig, FuyuConfig, GemmaConfig, Gemma2Config, Gemma3Config, Gemma3TextConfig, Gemma3nConfig, Gemma3nTextConfig, GitConfig, GlmConfig, Glm4Config, Glm4MoeConfig, GotOcr2Config, GPT2Config, GPT2Config, GPTBigCodeConfig, GPTNeoConfig, GPTNeoXConfig, GPTNeoXJapaneseConfig, GptOssConfig, GPTJConfig, GraniteConfig, GraniteMoeConfig, GraniteMoeHybridConfig, GraniteMoeSharedConfig, HeliumConfig, HunYuanDenseV1Config, HunYuanMoEV1Config, JambaConfig, JetMoeConfig, Lfm2Config, LlamaConfig, Llama4Config, Llama4TextConfig, LongcatFlashConfig, MambaConfig, Mamba2Config, MarianConfig, MBartConfig, MegaConfig, MegatronBertConfig, MiniMaxConfig, MinistralConfig, MistralConfig, MixtralConfig, MllamaConfig, ModernBertDecoderConfig, MoshiConfig, MptConfig, MusicgenConfig, MusicgenMelodyConfig, MvpConfig, NemotronConfig, OlmoConfig, Olmo2Config, Olmo3Config, OlmoeConfig, OpenLlamaConfig, OpenAIGPTConfig, OPTConfig, PegasusConfig, PersimmonConfig, PhiConfig, Phi3Config, Phi4MultimodalConfig, PhimoeConfig, PLBartConfig, ProphetNetConfig, QDQBertConfig, Qwen2Config, Qwen2MoeConfig, Qwen3Config, Qwen3MoeConfig, Qwen3NextConfig, RecurrentGemmaConfig, ReformerConfig, RemBertConfig, RobertaConfig, RobertaPreLayerNormConfig, RoCBertConfig, RoFormerConfig, RwkvConfig, SeedOssConfig, SmolLM3Config, Speech2Text2Config, StableLmConfig, Starcoder2Config, TransfoXLConfig, TrOCRConfig, VaultGemmaConfig, WhisperConfig, XGLMConfig, XLMConfig, XLMProphetNetConfig, XLMRobertaConfig, XLMRobertaXLConfig, XLNetConfig, xLSTMConfig, XmodConfig, ZambaConfig, Zamba2Config.
+
+
+
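For context on the log above: both `ValueError`s come from the auto-class registry pattern in `transformers`. `AutoModelForCausalLM` resolves a model class by looking up the config's class in an internal mapping, and a custom remote-code config such as `CoDAConfig` is not in that mapping, so the lookup fails. The stdlib-only sketch below illustrates the pattern; all names in it (`MODEL_FOR_CAUSAL_LM_MAPPING`, `from_config`, the toy classes) are made up for illustration and are not the library's actual internals.

```python
# Illustrative sketch of the auto-class registry pattern (NOT transformers' code).

class LlamaConfig:        # stands in for a config class registered with the auto-class
    pass

class LlamaForCausalLM:   # stands in for the model class it maps to
    pass

class CoDAConfig:         # stands in for a custom remote-code config that was never registered
    pass

# The auto-class conceptually keeps a mapping from config class to model class.
MODEL_FOR_CAUSAL_LM_MAPPING = {LlamaConfig: LlamaForCausalLM}

def from_config(config):
    """Resolve and build a model from a config, as an auto-factory would."""
    model_cls = MODEL_FOR_CAUSAL_LM_MAPPING.get(type(config))
    if model_cls is None:
        # Mirrors the shape of the error in the log above.
        raise ValueError(
            f"Unrecognized configuration class {type(config).__name__} "
            f"for this kind of AutoModel."
        )
    return model_cls()

print(type(from_config(LlamaConfig())).__name__)  # LlamaForCausalLM
try:
    from_config(CoDAConfig())
except ValueError as e:
    print("Unrecognized configuration class" in str(e))  # True
```

The registered config resolves to its model class, while the unregistered one raises the same kind of `ValueError` seen twice in the traceback (once per precision fallback attempted by `infer_framework_load_model`).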