# Hierarchical Neural Process for Pharmacokinetic Data

## Overview

An amortized-context neural process generative model for pharmacokinetic (PK) modelling.
Model details:

- Authors: César Ojeda (@cesarali)
- License: Apache 2.0
## Intended use

Sampling drug-concentration trajectories, and predicting either new time points for an observed individual or full trajectories for a new individual.
## Runtime Bundle

This repository is the consumer-facing runtime bundle for this PK model.

- Runtime repo: `cesarali/AICME-runtime`
- Native training/artifact repo: `cesarali/AICMEPK_cluster`
- Supported tasks: `generate`, `predict`
- Default task: `generate`
- Load path: `AutoModel.from_pretrained(..., trust_remote_code=True)`
## Installation

You do not need to install `sim_priors_pk` to use this runtime bundle.
`transformers` is the public loading entrypoint, but `transformers` alone is
not sufficient because this is a PyTorch model with custom runtime code. A
reliable consumer environment is:

```bash
pip install torch transformers huggingface_hub lightning datasets pandas torchtyping gpytorch pot torchdiffeq torchsde ruamel.yaml pyyaml
```
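To confirm the environment resolved correctly, a quick import check can help. This is a minimal sketch: the package names simply mirror the pip command above (packages with non-obvious import names, such as `pot` and `ruamel.yaml`, are omitted for brevity).

```python
# Sanity-check that the core runtime dependencies are importable.
import importlib.util

required = [
    "torch", "transformers", "huggingface_hub", "lightning",
    "datasets", "pandas", "gpytorch", "torchdiffeq", "torchsde",
]

# find_spec returns None for packages that are not installed.
missing = [name for name in required if importlib.util.find_spec(name) is None]
print("missing packages:", missing or "none")
```

If any package is reported missing, re-run the pip command above before loading the model.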
## Python Usage

```python
from transformers import AutoModel

model = AutoModel.from_pretrained("cesarali/AICME-runtime", trust_remote_code=True)

studies = [
    {
        "context": [
            {
                "name_id": "ctx_0",
                "observations": [0.2, 0.5, 0.3],
                "observation_times": [0.5, 1.0, 2.0],
                "dosing": [1.0],
                "dosing_type": ["oral"],
                "dosing_times": [0.0],
                "dosing_name": ["oral"],
            }
        ],
        "target": [],
        "meta_data": {"study_name": "demo", "substance_name": "drug_x"},
    }
]

outputs = model.run_task(
    task="generate",
    studies=studies,
    num_samples=4,
)
print(outputs["results"][0]["samples"])
```
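Assembling study dicts by hand is error-prone. A small helper can build individual records in the schema shown above; this is a hypothetical convenience (not part of the model's API), but the field names mirror the example exactly.

```python
# Hypothetical helper to build one individual record in the study schema
# used by run_task. Field names match the example above.
def make_individual(name_id, observations, observation_times,
                    dose=1.0, dose_time=0.0, route="oral"):
    """Build a context/target individual dict, validating array lengths."""
    assert len(observations) == len(observation_times), \
        "each observation needs a matching observation time"
    return {
        "name_id": name_id,
        "observations": list(observations),
        "observation_times": list(observation_times),
        "dosing": [dose],
        "dosing_type": [route],
        "dosing_times": [dose_time],
        "dosing_name": [route],
    }

study = {
    "context": [make_individual("ctx_0", [0.2, 0.5, 0.3], [0.5, 1.0, 2.0])],
    "target": [],
    "meta_data": {"study_name": "demo", "substance_name": "drug_x"},
}
```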
## Predictive Sampling

```python
from transformers import AutoModel

model = AutoModel.from_pretrained("cesarali/AICME-runtime", trust_remote_code=True)

predict_studies = [
    {
        "context": [
            {
                "name_id": "ctx_0",
                "observations": [0.2, 0.5, 0.3],
                "observation_times": [0.5, 1.0, 2.0],
                "dosing": [1.0],
                "dosing_type": ["oral"],
                "dosing_times": [0.0],
                "dosing_name": ["oral"],
            }
        ],
        "target": [
            {
                "name_id": "tgt_0",
                "observations": [0.25, 0.31],
                "observation_times": [0.5, 1.0],
                "remaining": [0.0, 0.0, 0.0],
                "remaining_times": [2.0, 4.0, 8.0],
                "dosing": [1.0],
                "dosing_type": ["oral"],
                "dosing_times": [0.0],
                "dosing_name": ["oral"],
            }
        ],
        "meta_data": {"study_name": "demo", "substance_name": "drug_x"},
    }
]

outputs = model.run_task(
    task="predict",
    studies=predict_studies,
    num_samples=4,
)
print(outputs["results"][0]["samples"][0]["target"][0]["prediction_samples"])
```
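The returned `prediction_samples` can then be summarized per time point. The snippet below is a sketch under an assumption about the output shape: that each sample is one concentration trajectory aligned with `remaining_times`. The draws shown are hypothetical placeholder data, not model output.

```python
import statistics

remaining_times = [2.0, 4.0, 8.0]

# Hypothetical prediction_samples: one concentration trajectory per draw,
# aligned with remaining_times (an assumption about the output shape).
prediction_samples = [
    [0.18, 0.09, 0.03],
    [0.22, 0.11, 0.04],
    [0.20, 0.10, 0.05],
    [0.19, 0.08, 0.02],
]

# Aggregate across draws at each remaining time point.
means = [statistics.mean(vals) for vals in zip(*prediction_samples)]
stds = [statistics.stdev(vals) for vals in zip(*prediction_samples)]
for t, m, s in zip(remaining_times, means, stds):
    print(f"t={t}: mean={m:.3f} sd={s:.3f}")
```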
## Notes

- `trust_remote_code=True` is required because this model uses custom Hugging Face Hub runtime code.
- The consumer API is `transformers` + `run_task(...)`; the consumer does not need a local clone of this repository.
- This runtime bundle is intentionally separate from the native training export so you can evaluate both distribution paths in parallel.