
Added an option to let users make a call to any external API with a prompt #47

Merged
7 commits merged into main from rishabh/api on Nov 15, 2023

Conversation

@rishsriv (Member) commented on Nov 14, 2023:

This will be useful if, say, one wants to test multiple inference strategies on an LLM that is hosted on a server. Currently, this can be used to send requests to a hosted vLLM server.

I tested this with `python main.py -g api -q data/questions_gen.csv -b 5 -f prompts/prompt.md --url MY_VLLM_SERVER_URL -o results/results.csv`
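For context, below is a minimal sketch of the kind of request this option could send to a hosted vLLM server. It is not the PR's exact code: the `generate_completion` helper name, the `"prompt"` and `"n"` keys, and the response shape are assumptions based on vLLM's simple API server, while `"best_of"`, `"temperature"`, `"stop"`, and `"max_tokens"` mirror the request params discussed further down.

```python
import requests


def generate_completion(prompt: str, url: str, num_beams: int = 1) -> str:
    """Sketch of a single generation request to a hosted vLLM server.

    Assumes the server was started with vLLM's simple API server and that
    `url` points at its generate endpoint; adjust for your deployment.
    """
    payload = {
        "prompt": prompt,        # assumed key name
        "n": 1,                  # assumed key name
        "best_of": num_beams,
        "temperature": 0,
        "stop": [";", "```"],
        "max_tokens": 600,
    }
    resp = requests.post(url, json=payload, timeout=300)
    resp.raise_for_status()
    # Assumed response shape: {"text": ["<prompt + completion>", ...]}
    return resp.json()["text"][0]
```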

@wongjingping (Collaborator) left a comment:

lgtm, would you also like to add an entry in the README with the sample command you used to run it?

"best_of": num_beams,
"temperature": 0,
"stop": [";", "```"],
"max_tokens": 600,
@wongjingping (Collaborator) commented on the request params above:

nit: are we able to set return_full_text in the request params here? if not we can remove the comment above.

@rishsriv (Member, Author) replied:

Ah we're not, removed!

@rishsriv (Member, Author) replied:

Will do!

rishsriv merged commit d76237e into main on Nov 15, 2023
1 check passed
rishsriv deleted the rishabh/api branch on November 15, 2023 at 04:01