Hacker News
The most reliable AI agent that works – where Claude, Gemini, and o3 fail (recursal.ai)
1 point by djshah 6 months ago | past
Qwerky: Attention is not what you need? RWKV mashed into QwQ models (recursal.ai)
3 points by vessenes 8 months ago | past | 1 comment
Attention is NOT all you need: Qwerky-72B trained using only 8 AMD MI300X GPUs (recursal.ai)
20 points by jtatarchuk 8 months ago | past | 3 comments
Training large attention free models (recursal.ai)
1 point by smusamashah 8 months ago | past
Qwerky 72B – A 72B LLM without transformer attention (recursal.ai)
4 points by pico_creator 8 months ago | past
Please stop throwing money at AI founders with no commercial plan, besides AGI (recursal.ai)
1 point by tosh on April 9, 2024 | past
EagleX 1.7T: Soaring past LLaMA 7B 2T in both English and Multi-lang evals (recursal.ai)
36 points by lhofer on March 16, 2024 | past | 9 comments
