I'm excited to give this a try! I wrote a simple script that takes a URL and uses your library to annotate the PDF found there. However, I don't have a GPU available, and it seems there isn't currently a way to tell that to the Annotate constructor or to the annotate call. I'm getting the following error:
python annotate_paper.py -u https://arxiv.org/abs/1807.05520
Traceback (most recent call last):
File "/home/eswens13/test/annotate/annotate_paper.py", line 35, in <module>
annotate(args.url, model=args.model)
File "/home/eswens13/test/annotate/annotate_paper.py", line 7, in annotate
annotate = Annotate(model)
^^^^^^^^^^^^^^^
File "/home/eswens13/.local/lib/python3.12/site-packages/annotateai/annotate.py", line 34, in __init__
self.llm = LLM(llm)
^^^^^^^^
File "/home/eswens13/.local/lib/python3.12/site-packages/txtai/pipeline/llm/llm.py", line 39, in __init__
self.generator = GenerationFactory.create(path, method, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/eswens13/.local/lib/python3.12/site-packages/txtai/pipeline/llm/factory.py", line 41, in create
return HFGeneration(path, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/eswens13/.local/lib/python3.12/site-packages/txtai/pipeline/llm/huggingface.py", line 26, in __init__
self.llm = HFLLM(path, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/home/eswens13/.local/lib/python3.12/site-packages/txtai/pipeline/llm/huggingface.py", line 39, in __init__
super().__init__(self.task(path, task, **kwargs), path, quantize, gpu, model, **kwargs)
File "/home/eswens13/.local/lib/python3.12/site-packages/txtai/pipeline/hfpipeline.py", line 56, in __init__
self.pipeline = pipeline(task, model=path, device=device, model_kwargs=modelargs, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/eswens13/.local/lib/python3.12/site-packages/transformers/pipelines/__init__.py", line 940, in pipeline
framework, model = infer_framework_load_model(
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/eswens13/.local/lib/python3.12/site-packages/transformers/pipelines/base.py", line 289, in infer_framework_load_model
model = model_class.from_pretrained(model, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/eswens13/.local/lib/python3.12/site-packages/transformers/models/auto/auto_factory.py", line 564, in from_pretrained
return model_class.from_pretrained(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/eswens13/.local/lib/python3.12/site-packages/transformers/modeling_utils.py", line 3669, in from_pretrained
hf_quantizer.validate_environment(
File "/home/eswens13/.local/lib/python3.12/site-packages/transformers/quantizers/quantizer_awq.py", line 71, in validate_environment
raise RuntimeError(
RuntimeError: GPU is required to run AWQ quantized model. You can use IPEX version AWQ if you have an Intel CPU
I have downloaded IPEX and the version of AWQ that supports it, but I can't get around this error. I believe the issue is that the Annotate constructor calls the LLM constructor with no kwargs (e.g. somewhere I could set gpu=False).
I'm running Fedora 40 and, again, no GPU available. Any help is appreciated.
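To illustrate the diagnosis above, here is a minimal, self-contained sketch of the kwargs-forwarding pattern the report is asking for. These classes are stand-ins, not annotateai's or txtai's actual code: if the wrapper accepted and forwarded **kwargs, a caller could pass gpu=False all the way down.

```python
# Hypothetical stand-ins, not the real annotateai/txtai classes.

class LLM:
    """Stand-in for txtai's LLM; records the kwargs it receives."""
    def __init__(self, path, **kwargs):
        self.path = path
        self.kwargs = kwargs

class Annotate:
    """Stand-in wrapper that forwards **kwargs to the inner LLM.
    The current library reportedly calls LLM(model) with no kwargs,
    so there is no way to reach settings like gpu=False."""
    def __init__(self, model, **kwargs):
        self.llm = LLM(model, **kwargs)

# With forwarding in place, this call would reach the LLM constructor:
annotate = Annotate("some/model", gpu=False)
print(annotate.llm.kwargs)  # → {'gpu': False}
```

This mirrors how txtai's own layers already pass kwargs through (the traceback shows `**kwargs` flowing from `LLM` down to `pipeline(...)`); only the outermost `Annotate` wrapper drops them.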