
Integrating Local Open LLMs (LLM-Jp) with MLflow Prompt Engineering UI by ss-13


1 Comment

  • Post Author
    ss-13
    Posted April 20, 2025 at 6:37 pm

    I’ve been experimenting with MLflow’s Prompt Engineering UI, which lets you do no-code prompt tuning across multiple LLMs. While it supports providers like OpenAI out of the box, I wanted to try it with Japanese open-source models from the LLM-jp project.

    This repo shows how to serve these models locally using MLflow’s pyfunc model interface, expose them via the MLflow AI Gateway, and compare prompt performance through the UI.
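    The pyfunc side of this can be sketched roughly as below. This is a minimal illustration, not the repo's actual code: the class and helper names are mine, the instruction-style prompt template is an assumption (the llm-jp instruct models define their own chat format), and it assumes `mlflow` and `transformers` are installed.

    ```python
    # Sketch: wrapping an LLM-jp model as an MLflow pyfunc model.
    # Illustrative only -- names, template, and generation settings are assumptions.

    def build_chat_prompt(user_text: str) -> str:
        # Assumed instruction-style template; the official llm-jp chat
        # format (via tokenizer.apply_chat_template) may differ.
        return f"### 指示:\n{user_text}\n\n### 応答:\n"

    def make_pyfunc_model():
        # Heavy imports kept inside the factory so the module loads cheaply.
        import mlflow.pyfunc

        class LLMJpModel(mlflow.pyfunc.PythonModel):
            def load_context(self, context):
                from transformers import pipeline
                # Model name taken from the post; weights download on first use.
                self.pipe = pipeline(
                    "text-generation",
                    model="llm-jp/llm-jp-3-3.7b-instruct",
                )

            def predict(self, context, model_input):
                # model_input is expected to carry a "prompt" column.
                outputs = []
                for prompt in model_input["prompt"]:
                    result = self.pipe(build_chat_prompt(prompt), max_new_tokens=128)
                    outputs.append(result[0]["generated_text"])
                return outputs

        return LLMJpModel()
    ```

    A model like this would be logged with `mlflow.pyfunc.log_model` and then served with `mlflow models serve` so the gateway can reach it over HTTP.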

    It includes a working setup with:
    – Hugging Face LLM-jp models (e.g. llm-jp-3-3.7b-instruct)
    – MLflow Model Serving
    – MLflow Gateway
    – Prompt Engineering UI
    – Streamlit UI for experiment tracking

    GitHub: https://github.com/suzuki-2001/mlflow-llm-jp-integration
    Japanese article explaining the project: https://zenn.dev/shosuke_13/articles/21d304b5f80e00
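    For readers wondering how the served model gets wired into the Prompt Engineering UI: the MLflow AI Gateway reads endpoints from a YAML config. A minimal sketch of such a config, assuming the pyfunc model is already being served locally (the endpoint name, port, and exact field names are illustrative and vary across MLflow versions):

    ```yaml
    # Illustrative gateway config -- field names follow recent MLflow
    # versions ("endpoints"); older releases used "routes"/"route_type".
    endpoints:
      - name: llm-jp-completions
        endpoint_type: llm/v1/completions
        model:
          provider: mlflow-model-serving
          name: llm-jp-3-3.7b-instruct
          config:
            model_server_url: http://localhost:5001
    ```

    The gateway would then be started with `mlflow deployments start-server --config-path` pointing at this file, and the tracking server pointed at the gateway (e.g. via `MLFLOW_DEPLOYMENTS_TARGET`) so its endpoints show up in the Prompt Engineering UI. See the linked repo for the exact, version-tested setup.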


© 2025 HackTech.info. All Rights Reserved.
