Meeting “mellea” !!!

This content originally appeared on DEV Community and was authored by Alain Airom

I just got my hands on “mellea”, an open-source library from IBM for generative programming.

Introduction: What is mellea?

Open source AI models are losing to ChatGPT because they lack the polished infrastructure that makes closed platforms easy to use. IBM’s new library, Mellea, aims to fix that.

mellea is a library for writing generative programs 👍.

What Does mellea Do?

Excerpt from mellea’s documentation:

This project takes us back to the future of computing by formally introducing the concept of generative programs — software systems that strategically integrate calls to Large Language Models (LLMs) — and the demanding engineering required to make them reliable. The fundamental challenge we address is how to safely and predictably harness the powerful but inherently stochastic operations of LLMs within traditionally deterministic codebases. This documentation establishes a rigorous framework, emphasizing core techniques like requirement verification to circumscribe periods of non-determinism, mechanisms for repairing failure traces, and advanced context management. Ultimately, this work outlines essential principles and architectural patterns needed to construct robust, high-confidence generative software that effectively merges the capabilities of LLMs with reliable computational predictability.
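The generate-verify-repair pattern described above can be sketched in plain Python, independent of mellea’s actual API. The `generate` and `requirements` callables below are hypothetical stand-ins for an LLM call and its checkers; the point is that the non-deterministic step is circumscribed inside a loop that only exits on a verified output (or an exhausted budget):

```python
from typing import Callable

def sample_until_valid(
    generate: Callable[[], str],
    requirements: list[Callable[[str], bool]],
    loop_budget: int = 3,
) -> str:
    """Repeatedly sample from a stochastic generator until every
    requirement passes, or the loop budget is exhausted."""
    last = ""
    for _ in range(loop_budget):
        last = generate()  # the non-deterministic step, circumscribed here
        if all(req(last) for req in requirements):
            return last  # verified output leaves the stochastic region
    return last  # budget exhausted: caller decides how to handle failure

# Stub "LLM" that is wrong some of the time.
outputs = iter(["hi there", "Dear interns, welcome!", "Dear interns, hello!"])
result = sample_until_valid(
    generate=lambda: next(outputs),
    requirements=[lambda s: s.startswith("Dear interns")],
)
print(result)  # → "Dear interns, welcome!"
```

This is essentially what mellea’s rejection-sampling strategy does for you, with the added ability to feed failure traces back into a repair step.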

Excerpt from mellea’s documentation:

In classical programming, pure (stateless) functions are a simple and powerful abstraction. A pure function takes inputs, computes outputs, and has no side effects. Generative programs can also use functions as abstraction boundaries, but in a generative program the meaning of the function can be given by an LLM instead of an interpreter or compiler. This is the idea behind a GenerativeSlot.
A GenerativeSlot is a function whose implementation is provided by an LLM. In Mellea, you define these using the @generative decorator. The function signature specifies the interface, and the docstring (or type annotations) guide the LLM in producing the output.
Example of a sentiment classifier, where classify_sentiment is a GenerativeSlot: it looks like a normal function, but its implementation is handled by the LLM. The type annotation (Literal["positive", "negative"]) constrains the output, and the prompt is automatically constructed from the function signature and docstring.

# file: https://github.com/generative-computing/mellea/blob/main/docs/examples/tutorial/sentiment_classifier.py#L1-L13
from typing import Literal
from mellea import generative, start_session

@generative
def classify_sentiment(text: str) -> Literal["positive", "negative"]:
  """Classify the sentiment of the input text as 'positive' or 'negative'."""
  ...

m = start_session()
sentiment = classify_sentiment(m, text="I love this!")
print("Output sentiment is:", sentiment)

What is mellea’s Architecture?

Mellea’s core abstraction is called a Component. A Component is a structured object that represents a unit of interaction with an LLM. The Mellea stdlib contains a set of useful components, but you can also define your own. We have already seen some components: Instruction and Requirement are both Components.

Components are composite data structures; that is, a Component can be made up of many other parts. Each of those parts is either a CBlock or another Component. CBlocks, or “content blocks”, are an atomic unit of text or data. CBlocks hold raw text (or sometimes parsed representations) and can be used as leaves in the Component DAG.

Backends are the engines that actually run the LLM. Backends consume Components, format the Component, pass the formatted input to an LLM, and return model outputs, which are then parsed back into CBlocks or Components.

During the course of an interaction with an LLM, several Components and CBlocks may be created. Logic for handling this trace of interactions is provided by a Context object. Some book-keeping needs to be done in order for Contexts to appropriately handle a trace of Components and CBlocks. The MelleaSession class, which is created by mellea.start_session(), does this book-keeping as a simple wrapper around Contexts and Backends.

When we call m.instruct(), the MelleaSession.instruct method creates a component called an Instruction. Instructions are part of the Mellea standard library.
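The Component/CBlock relationship can be illustrated with a toy model. These are plain dataclasses, not mellea’s real classes; they are only a sketch of the composite structure, where `render` stands in for what a Backend does when it formats a Component before sending it to an LLM:

```python
from dataclasses import dataclass, field
from typing import Union

@dataclass
class CBlock:
    """Atomic unit of text or data: a leaf in the Component DAG."""
    text: str

@dataclass
class Component:
    """Composite node: made of CBlocks and/or other Components."""
    name: str
    parts: list[Union["Component", CBlock]] = field(default_factory=list)

    def render(self) -> str:
        """Walk the DAG and concatenate leaf text, the way a Backend
        would format a Component before passing it to an LLM."""
        out = []
        for p in self.parts:
            out.append(p.render() if isinstance(p, Component) else p.text)
        return " ".join(out)

# An Instruction-like component containing a Requirement-like sub-component.
instruction = Component("Instruction", [
    CBlock("Write an email to the interns."),
    Component("Requirement", [CBlock("Use 'Dear interns' as greeting.")]),
])
print(instruction.render())
# → "Write an email to the interns. Use 'Dear interns' as greeting."
```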

In other words, mellea enables users to build safe and trustworthy generative AI applications 🙌.

Tests and Usage

mellea works with Ollama. If you want to try it, you’ll need to install Ollama locally; depending on the code you run, the required model(s) are downloaded automatically by Ollama.

I reused three of the sample applications provided, modifying them slightly (adding markdown-formatted, timestamped files as output), and share them below. The results are outstanding ☄️.

Email Generator

# it works best with uv
uv venv mellea-env
source mellea-env/bin/activate

uv pip install mellea
# SimpleEmailApp.py
import mellea
import os 
from datetime import datetime 

# --- Configuration ---
APP_NAME = "SimpleEmailApp"
OUTPUT_DIR = "./output"
# ---------------------

def write_email(m: mellea.MelleaSession, name: str, notes: str) -> str:
  email = m.instruct(
    "Write an email to {{name}} using the notes following: {{notes}}.",
    user_variables={"name": name, "notes": notes},
  )
  return email.value


os.makedirs(OUTPUT_DIR, exist_ok=True)

m = mellea.start_session()
email_content = write_email(m, "Alain",
                  "Alain helped the lab over the last few weeks by organizing intern events, advertising the speaker series, and handling issues with snack delivery.")

timestamp = datetime.now().strftime("%Y-%m-%d %H:%M:%S")

# Create the markdown-formatted, timestamped string
output_string = (
    f"# {APP_NAME} Output\n"
    f"\n"
    f"**Timestamp**: {timestamp}\n"
    f"\n"
    f"***\n"
    f"\n"
    f"## Generated Email\n"
    f"\n"
    f"```
{% endraw %}
\n"
    f"{email_content}\n"
    f"
{% raw %}
```\n"
)

# 4. Write the output to a file
filename = datetime.now().strftime("email_output_%Y%m%d_%H%M%S.md")
file_path = os.path.join(OUTPUT_DIR, filename)

with open(file_path, "w") as f:
    f.write(output_string)

print(f"✅ Output successfully written to {file_path}")
SimpleEmailApp Output

**Timestamp**: 2025-11-05 11:00:53

***

Generated Email

Subject: Appreciation for Your Valuable Contributions in the Lab

Dear Alain,

I hope this message finds you well. I am writing to express our gratitude for your significant contributions to the lab over the past few weeks.

Your efforts have been instrumental in organizing several intern events, which provided an excellent platform for collaboration and knowledge sharing among our team members. The success of these events is a testament to your dedication and organizational skills.

Additionally, your role in advertising the speaker series was commendable. Your proactive approach has helped us reach a wider audience, thereby contributing to increased participation and engagement. This has not only enriched our internal discussions but also provided valuable insights for future research directions.

Lastly, handling the issues with snack delivery has been crucial for maintaining morale and productivity within the lab. Your commitment to ensuring that every team member is well-fed during these events speaks volumes about your dedication to our collective wellbeing.

Your hard work and professionalism have greatly enhanced our working environment and contributed significantly towards achieving our shared goals. We are incredibly grateful for your support and contributions.

Once again, thank you for your excellent service. Please feel free to reach out if there's anything we can do to return the favor or support you in any way.

Best Regards,

[Your Name]
[Your Position]

ℹ️ The code downloaded ibm/granite4:micro.

Document Processor with Docling

For this code, the Docling package is needed.

uv pip install "mellea[docling]"
# doc_processor_app.py
import mellea
import os
import sys
from datetime import datetime
from mellea.stdlib.docs.richdocument import RichDocument, Table
from mellea.backends.types import ModelOption

# --- Configuration ---
APP_NAME = "MelleaDocProcessor"
INPUT_DIR = "./input"
OUTPUT_DIR = "./output"
# ---------------------

def get_input_file_path() -> str:
    """Finds the first PDF file in the INPUT_DIR."""

    if not os.path.exists(INPUT_DIR):
        print(f"🚨 Error: Input directory '{INPUT_DIR}' not found.")
        print("Please create the folder and place a PDF file inside.")
        sys.exit(1)

    pdf_files = [f for f in os.listdir(INPUT_DIR) if f.lower().endswith(".pdf")]

    if not pdf_files:
        print(f"🚨 Error: No PDF files found in '{INPUT_DIR}'.")
        print("Please place a PDF file in the folder to process.")
        sys.exit(1)

    input_filename = pdf_files[0]
    return os.path.join(INPUT_DIR, input_filename)


def get_rich_document(file_path: str) -> RichDocument:
    """Loads the document from the specified file path."""
    print(f"Loading document from: {file_path}")
    try:
        return RichDocument.from_document_file(file_path)
    except Exception as e:
        print(f"Error loading document: {e}")
        print("Ensure the PDF file is valid and Mellea's dependencies (Ollama) are running.")
        sys.exit(1)

def transform_document(rd: RichDocument) -> str:
    """Extracts the first table and uses Mellea to transform it."""

    tables = rd.get_tables()
    if not tables:
        return "Error: No tables found in the document."

    table1: Table = tables[0]

    markdown_output = "## Original Table Extracted\n"
    markdown_output += table1.to_markdown() + "\n\n"

    m = mellea.start_session()

    markdown_output += "## Transformed Table Attempts\n"

    for seed in [x * 12 for x in range(5)]:
        markdown_output += f"**Attempt with Seed: {seed}**\n"
        print(f"--- Attempting transformation with seed: {seed} ---")

        table2 = m.transform(
            table1,
            "Add a column 'Model' that extracts which model was used or 'None' if none.",
            model_options={ModelOption.SEED: seed},
        )

        if isinstance(table2, Table):
            markdown_output += table2.to_markdown() + "\n\n"
            print("✅ Transformation successful.")
            break
        else:
            markdown_output += "==== TRYING AGAIN after non-useful output.====\n\n"
            print("==== TRYING AGAIN after non-useful output.====")

    return markdown_output

def main():
    """Main function to run the application logic."""

    # 1. Setup folders and find input file
    os.makedirs(OUTPUT_DIR, exist_ok=True)
    input_file_path = get_input_file_path() # Dynamic discovery

    # 2. Process the document
    rd = get_rich_document(input_file_path)
    output_content = transform_document(rd)

    # 3. Format and write the final output
    timestamp = datetime.now().strftime("%Y-%m-%d %H:%M:%S")
    input_filename_base = os.path.basename(input_file_path)

    # Create the final markdown file content
    final_output_string = (
        f"# {APP_NAME} Output\n"
        f"\n"
        f"**Source File**: `{input_filename_base}`\n"
        f"**Timestamp**: {timestamp}\n"
        f"\n"
        f"***\n"
        f"\n"
        f"{output_content}"
    )

    # Use a timestamped filename for uniqueness
    filename = datetime.now().strftime(f"{APP_NAME}_output_%Y%m%d_%H%M%S.md")
    file_path = os.path.join(OUTPUT_DIR, filename)

    with open(file_path, "w") as f:
        f.write(final_output_string)

    print(f"\n✅ All output successfully written to {file_path}")

if __name__ == "__main__":
    main()
MelleaDocProcessor Output

**Source File**: `1906.04043v1.pdf`
**Timestamp**: 2025-11-05 11:12:26

***

Original Table Extracted
Table 1: Cross-validated results of fake-text discriminators. Distributional information yield a higher informativeness than word-features in a logistic regression.

| Feature                              | AUC         |
|--------------------------------------|-------------|
| Bag of Words                         | 0.63 ± 0.11 |
| (Test 1 - GPT-2) Average Probability | 0.71 ± 0.25 |
| (Test 2 - GPT-2) Top-K Buckets       | 0.87 ± 0.07 |
| (Test 1 - BERT) Average Probability  | 0.70 ± 0.27 |
| (Test 2 - BERT) Top-K Buckets        | 0.85 ± 0.09 |

Transformed Table Attempts
**Attempt with Seed: 0**
| Feature                              | AUC         |
|--------------------------------------|-------------|
| Bag of Words                         | 0.63 ± 0.11 |
| (Test 1 - GPT-2) Average Probability | 0.71 ± 0.25 |
| (Test 2 - GPT-2) Top-K Buckets       | 0.87 ± 0.07 |
| (Test 1 - BERT) Average Probability  | 0.70 ± 0.27 |
| (Test 2 - BERT) Top-K Buckets        | 0.85 ± 0.09 |

Email Validator

This works with the packages already installed, but it calls the Mistral LLM (mistral:7b), which is downloaded automatically.

# email_validate_app.py
import mellea
import os
from datetime import datetime
from mellea.backends.types import ModelOption
from mellea.backends.ollama import OllamaModelBackend
from mellea.backends import model_ids
from mellea.stdlib.sampling import RejectionSamplingStrategy

# --- Configuration ---
APP_NAME = "MelleaValidatedEmail"
OUTPUT_DIR = "./output"
# ---------------------

def generate_validated_email() -> str:
    """Creates a Mellea session and generates an email with validation/repair."""

    m = mellea.MelleaSession(
        backend=OllamaModelBackend(
            model_id=model_ids.MISTRALAI_MISTRAL_0_3_7B,
            model_options={ModelOption.MAX_NEW_TOKENS: 300},
        )
    )

    email_v1 = m.instruct(
        "Write an email to invite all interns to the office party.",
        requirements=["be formal", "Use 'Dear interns' as greeting."],
        strategy=RejectionSamplingStrategy(loop_budget=3),
    )

    return str(email_v1)

def write_output(email_content: str):
    """Formats and writes the output to a timestamped markdown file."""

    os.makedirs(OUTPUT_DIR, exist_ok=True)

    timestamp = datetime.now().strftime("%Y-%m-%d %H:%M:%S")

    output_string = (
        f"# {APP_NAME} Output\n"
        f"\n"
        f"**Timestamp**: {timestamp}\n"
        f"**Model Used**: `{model_ids.MISTRALAI_MISTRAL_0_3_7B}`\n"
        f"\n"
        f"***\n"
        f"\n"
        f"## Generated and Validated Email\n"
        f"The following output adheres to requirements: 'be formal' and 'Use 'Dear interns' as greeting'.\n"
        f"\n"
        f"```text\n"
        f"{email_content}\n"
        f"```\n"
    )

    filename = datetime.now().strftime(f"{APP_NAME}_output_%Y%m%d_%H%M%S.md")
    file_path = os.path.join(OUTPUT_DIR, filename)

    with open(file_path, "w") as f:
        f.write(output_string)

    print(f"\n✅ Output successfully written to {file_path}")
    print(f"\n***** Generated Email Preview ****\n{email_content}\n*******")

if __name__ == "__main__":
    generated_email = generate_validated_email()
    write_output(generated_email)
MelleaValidatedEmail Output

**Timestamp**: 2025-11-05 11:29:13
**Model Used**: `ModelIdentifier(hf_model_name='mistralai/Mistral-7B-Instruct-v0.3', ollama_name='mistral:7b', watsonx_name=None, mlx_name=None, hf_tokenizer_name=None)`

***

Generated and Validated Email
The following output adheres to requirements: 'be formal' and 'Use 'Dear interns' as greeting'.

 Subject: Invitation to the Annual Office Party - Saturday, June 18th

Dear Interns,

I hope this message finds you well and enjoying your time with us at [Company Name].

It is my pleasure to extend an invitation to each and every one of our interns for our upcoming annual office party. This event is an opportunity for all employees to gather, celebrate, and strengthen the bonds that make up our unique work community.

The party will be held on Saturday, June 18th from 6:00 PM to 9:00 PM at [Location]. There will be food, drinks, music, games, and various other forms of entertainment. It promises to be a night filled with laughter, camaraderie, and fun!

Please RSVP by Friday, June 10th, to ensure we have an accurate headcount for catering and seating arrangements. If you have any dietary restrictions or specific accessibility needs, please include that information in your response as well.

We encourage all interns to attend, so please invite any friends or family members who may be visiting and would like to join the festivities.

This event is a testament to our appreciation for the hard work and dedication you bring to [Company Name] each day. We look forward to celebrating together and creating memories that will last long after your internship comes to an end

Et voilà 🍾

🌟 Conclusion: Mellea — The Framework for Reliable Generative Programs

mellea is not just another library; it is a rigorous framework that tackles the fundamental challenge of building reliable generative programs. It serves as the bridge between the powerful, yet stochastic nature of Large Language Models (LLMs) and the predictable, deterministic codebases we rely on.

Mellea achieves this reliability through a precise architectural approach centered on three core concepts:

  • The Component Abstraction: All interactions with an LLM are structured as Components (like Instruction or Requirement). These are composite data structures built from atomic text/data units called CBlocks, allowing for structured and traceable units of LLM interaction.
  • Context Management: The MelleaSession class—a simple wrapper around the Context object—is responsible for the essential book-keeping required to manage the entire trace of Components and CBlocks created during an interaction, providing a mechanism for handling and repairing failure traces.
  • Generative Abstraction: It formalizes advanced concepts like GenerativeSlots, extending the powerful abstraction of the pure function into the generative world. Here, the meaning and output of the function are determined not by a compiler, but by the LLM itself.

Ultimately, Mellea provides the essential principles and architectural patterns — emphasizing requirement verification and trace repair — necessary to construct robust, high-confidence generative software that effectively merges LLM capability with traditional computational predictability. It is the definitive tool for engineering the next generation of reliable, LLM-integrated software.
