# prompt2model
**Repository Path**: mirrors/prompt2model
## Basic Information
- **Project Name**: prompt2model
- **Description**: Prompt2Model is a system that takes a natural-language task description (like the prompts used for LLMs such as ChatGPT) and trains a small special-purpose model that is well suited to deployment
- **Primary Language**: Python
- **License**: Apache-2.0
- **Default Branch**: main
- **Homepage**: https://www.oschina.net/p/prompt2model
- **GVP Project**: No
## Statistics
- **Stars**: 2
- **Forks**: 0
- **Created**: 2023-08-29
- **Last Updated**: 2025-12-13
## Categories & Tags
**Categories**: Uncategorized
**Tags**: None
## README
# Prompt2Model - Generate Deployable Models from Instructions
[PyPI](https://badge.fury.io/py/prompt2model)

[License](https://lbesson.mit-license.org/)
[Discord](https://discord.gg/UCy9csEmFc)
[Open in Colab](https://colab.research.google.com/github/neulab/prompt2model/blob/main/prompt2model_demo.ipynb)
`Prompt2Model` is a system that takes a natural-language
task description (like the prompts used for
LLMs such as ChatGPT) and trains a small
special-purpose model that is well suited to deployment.
## Quick Start
### Notebook
You can run our demo of `Prompt2Model` through a notebook:
- [Open Locally](./prompt2model_demo.ipynb)
- [Open in Colab](https://colab.research.google.com/github/neulab/prompt2model/blob/main/prompt2model_demo.ipynb)
### Command Line
You can also run `Prompt2Model` from the command line. First, install the package:
```bash
pip install prompt2model
```
`Prompt2Model` supports various model providers, such as OpenAI, Anthropic, and Hugging Face, via [LiteLLM](https://github.com/BerriAI/litellm).
If you are using OpenAI models (such as the default `gpt-3.5-turbo`), obtain an
OpenAI API key on their [website](https://platform.openai.com/), then set
the environment variable `OPENAI_API_KEY` to your key by running
the following command in your terminal:
```bash
export OPENAI_API_KEY=<your key>
```
[List of all supported providers](https://docs.litellm.ai/docs/providers)
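Before launching the demo, it can help to verify the key is actually visible to your Python process. The sketch below is not part of `prompt2model` itself; the helper name and the placeholder key value are assumptions for illustration only.

```python
import os

def api_key_configured(env_var: str = "OPENAI_API_KEY") -> bool:
    """Return True if the named provider API key is present and non-empty."""
    return bool(os.environ.get(env_var, "").strip())

# Stand-in value so the check passes in this example; use your real key.
os.environ["OPENAI_API_KEY"] = "sk-placeholder"
print(api_key_configured())  # → True
```

The same check works for other LiteLLM-supported providers by passing the provider's environment variable name.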
You can then run
```bash
python prompt2model_demo.py
```
to create a small model from a prompt, as shown in
the demo video below. The script must run on a
machine with an internet connection so it can reach the OpenAI
API. For best results, use a machine with a GPU for training your model.
## Demo
A demo video is available in the [upstream repository](https://github.com/neulab/prompt2model).
## Tips and Examples for Writing a Good Prompt
Tips and examples for writing a good prompt are
collected in [prompt_examples](./prompt_examples.md).
## Components
The `prompt2model` package is composed of several components, each designed
for a specific purpose. To understand how to use a component effectively,
consult the `readme.md` file in that component's directory, found at
`./prompt2model//readme.md`. These files provide detailed instructions
on customizing and getting the most out of each component.
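One way to survey the package is to gather those per-component readmes programmatically. This is a minimal sketch, not part of `prompt2model`: the component names below are invented examples, and the helper simply globs a directory tree of the shape the README describes.

```python
import tempfile
from pathlib import Path

def find_component_readmes(root: Path) -> list[Path]:
    """Collect readme.md files sitting one directory below `root`."""
    return sorted(root.glob("*/readme.md"))

# Demonstrate on a throwaway tree; in a real checkout, `root` would be
# the prompt2model package directory.
with tempfile.TemporaryDirectory() as tmp:
    root = Path(tmp)
    for component in ("dataset_generator", "model_trainer"):  # example names
        (root / component).mkdir()
        (root / component / "readme.md").write_text("...")
    print([p.parent.name for p in find_component_readmes(root)])
    # → ['dataset_generator', 'model_trainer']
```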
## Contribution
If you're interested in contributing to the `prompt2model` project, please
- refer to [CONTRIBUTING.md](CONTRIBUTING.md)
- open an [issue](https://github.com/neulab/prompt2model/issues) or submit a PR
- join us on [discord](https://discord.gg/UCy9csEmFc)
- or reach out to [@vijaytarian](https://twitter.com/vijaytarian)
and [@Chenan3_Zhao](https://twitter.com/Chenan3_Zhao) on Twitter
## Cite
We have [written a paper describing Prompt2Model in detail](https://arxiv.org/abs/2308.12261).
If you use Prompt2Model in your research, please cite us!
If you discuss or use the overall prompt2model framework, please cite:
```bibtex
@misc{prompt2model,
      title={Prompt2Model: Generating Deployable Models from Natural Language Instructions},
      author={Vijay Viswanathan and Chenyang Zhao and Amanda Bertsch and Tongshuang Wu and Graham Neubig},
      year={2023},
      eprint={2308.12261},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
```
If you discuss or use our dataset retrieval and transformation tools, please cite:
```bibtex
@misc{prompt2modeldatatune,
      title={Better Synthetic Data by Retrieving and Transforming Existing Datasets},
      author={Saumya Gandhi and Ritu Gala and Vijay Viswanathan and Tongshuang Wu and Graham Neubig},
      year={2024},
      eprint={2404.14361},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
```