Replies: 1 comment 2 replies
Hi, you can try a simple script like the one below to get started. The basic flow is: load your specific .rkllm model file, define your desired model parameters, generate a response from the model, and print the generated output. `rkllm.load_model('xxx.rkllm')` loads your model file (in this case, `xxx.rkllm`), and `model.generate(...)` is the main function for running the model: you can provide a prompt and specify parameters such as `max_context_length` and `max_generate_length`, just as you would on the command line. Give it a try first!
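Here is a minimal sketch of that flow. It assumes the `rkllm` Python module exposes `load_model` and a `generate` method with the parameter names mentioned above; check the bindings shipped with your installation, since the exact names and signatures may differ.

```python
import rkllm

# Load your specific rkllm model file (replace the placeholder path)
model = rkllm.load_model('xxx.rkllm')

# Generate a response; these parameter names mirror the CLI arguments
# (context length and generate length) and are assumptions, not a verified API
output = model.generate(
    prompt="Hello, who are you?",
    max_context_length=4096,
    max_generate_length=4096,
)

# Print the generated output
print(output)
```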
Sorry, I'm new to this....
Thanks to the one-click script, I have successfully installed rkllm and can run 'rkllm xxx.rkllm 4096 4096' with decent responses.
However, I don't know whether there is a Python API, so that I can build my own Python script with it?