- Issue created by @bogdog400
- 🇩🇪 Germany marcus_johansson
@bogdog400 It should be that token; it should start with "hf_" if it's a correct token.
I can't really replicate this. In the worst case, could you mail me an example token that I could use to try to replicate it? You can mail it to huggingface@marcusmailbox.com. As soon as I have tested it, you can revoke the token.
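A quick way to sanity-check a token is to call the public whoami-v2 endpoint, which answers 200 with account info for a valid token and 401 otherwise. A minimal Python sketch (the `requests` dependency and the `check_hf_token` name are only for illustration):

```python
import requests

def check_hf_token(token: str) -> bool:
    # A valid Hugging Face user access token starts with "hf_".
    if not token.startswith("hf_"):
        return False
    # whoami-v2 returns 200 with account info for a valid token, 401 otherwise.
    resp = requests.get(
        "https://huggingface.co/api/whoami-v2",
        headers={"Authorization": f"Bearer {token}"},
        timeout=10,
    )
    return resp.status_code == 200
```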
- Status changed to Postponed
- 🇩🇰 Denmark ressa Copenhagen, 12 June 2024
I was trying to use some of the meta-llama/Meta-Llama models (and some others) and got the same error, plus other warnings. In the end I got it working with these steps (a minimal test call is sketched after the list):
- Create a token
- Update permissions under "Access Tokens > YOUR_TOKEN > Manage > Edit Permissions" and set these values:
  - Inference
    - [x] Make calls to the serverless Inference API
    - [x] Make calls to Inference Endpoints
    - [ ] Manage Inference Endpoints
  - Repos
    - [x] Read access to contents of all public gated repos you can access

  [...]
- I got the error "The model meta-llama/Meta-Llama-3-8B is too large to be loaded automatically"; after changing to meta-llama/Meta-Llama-3-8B-Instruct it worked
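For reference, that "too large to be loaded automatically" message comes from the serverless Inference API declining to load a big model on demand, while the Instruct variant responds. A minimal sketch of the test call in Python (`requests` assumed installed; the `hf_...` placeholder stands for your own token):

```python
import requests

# Serverless Inference API endpoint for the model that worked above.
API_URL = "https://api-inference.huggingface.co/models/meta-llama/Meta-Llama-3-8B-Instruct"
headers = {"Authorization": "Bearer hf_..."}  # your access token

# A simple text-generation request; a 200 response confirms the token
# and its Inference permissions are set up correctly.
resp = requests.post(API_URL, headers=headers, json={"inputs": "Hello!"}, timeout=60)
print(resp.status_code)
print(resp.json())
```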