yikuan_xia


Challenges Entered

Improve RAG with Real-World Benchmarks

Latest submissions

No submissions made in this challenge.

Testing RAG Systems with Limited Web Pages

Latest submissions

No submissions made in this challenge.
Participant Rating
  • db3 Meta Comprehensive RAG Benchmark: KDD Cup 2024

Meta Comprehensive RAG Benchmark: KDD Cup 2024

🚨 IMP: Phase 2 Announcement

7 months ago

What does "API update" mean? Is the test environment updated to the new API?

Issues with submission: LFS file issues

8 months ago

During our last submission, we successfully got through the Docker build process, but an error occurred during the inference stage when we attempted to load the large model:

File "/src/models/dummy_model.py", line 86, in __init__
    self.m = LlamaForCausalLM.from_pretrained(model, device_map="balanced",
File "/home/aicrowd/.conda/lib/python3.8/site-packages/transformers/modeling_utils.py", line 3531, in from_pretrained
    ) = cls._load_pretrained_model(
File "/home/aicrowd/.conda/lib/python3.8/site-packages/transformers/modeling_utils.py", line 3938, in _load_pretrained_model
    state_dict = load_state_dict(shard_file, is_quantized=is_quantized)
File "/home/aicrowd/.conda/lib/python3.8/site-packages/transformers/modeling_utils.py", line 542, in load_state_dict
    raise OSError(
OSError: You seem to have cloned a repository without having git-lfs installed. Please install git-lfs and run git lfs install followed by git lfs pull in the folder you cloned.
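As background to the error above: an un-pulled Git LFS file is not the real binary weight file but a small text pointer whose first line is `version https://git-lfs.github.com/spec/v1`. A quick sanity check (a hypothetical helper, not part of any starter kit) is to inspect a model shard before loading it, to see whether the weights actually made it into the image:

```python
# Un-pulled Git LFS files are small text pointers that start with this
# header line (per the Git LFS pointer-file format); the real binary
# weights only appear after `git lfs pull`.
LFS_POINTER_PREFIX = b"version https://git-lfs.github.com/spec/v1"

def is_lfs_pointer(path: str) -> bool:
    """Return True if `path` looks like an un-pulled LFS pointer file
    rather than the actual model shard."""
    with open(path, "rb") as f:
        head = f.read(len(LFS_POINTER_PREFIX))
    return head == LFS_POINTER_PREFIX
```

If this returns True for a shard inside the built image, the clone step fetched only the pointers and `git lfs pull` never ran there.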

On the web page of our repository, the model files are correctly uploaded, each showing an LFS tag on the left. Since the log message suggests installing git-lfs, we made another attempt.

We added "git-lfs" to the original "apt.txt", but this time the submission did not get through the Docker build stage.

How can I fix this issue?

Another question: what is the current submission limit? Is it 6 times a week, and does that include failed submissions?
