underlines on Oct 8, 2024 | on: Longwriter – Increase llama3.1 output to 10k words
vLLM is simple to set up: use Docker, and make sure your backend (Ubuntu, WSL Ubuntu, or whatever) has GPU support installed.
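A minimal sketch of that Docker setup, assuming the NVIDIA Container Toolkit is already installed and using the official `vllm/vllm-openai` image (the model name here is just an example — swap in whichever Llama 3.1 variant you're serving):

```shell
# Quick sanity check that the container runtime can see the GPU.
docker run --rm --gpus all nvidia/cuda:12.4.1-base-ubuntu22.04 nvidia-smi

# Start the vLLM OpenAI-compatible server on port 8000, caching
# Hugging Face model downloads on the host so restarts are fast.
docker run --runtime nvidia --gpus all \
    -v ~/.cache/huggingface:/root/.cache/huggingface \
    -p 8000:8000 \
    vllm/vllm-openai:latest \
    --model meta-llama/Meta-Llama-3.1-8B-Instruct
```

Once it's up, any OpenAI-style client pointed at `http://localhost:8000/v1` should work.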