API Documentation#
API reference for the neurovlm module.
Inference#
The high-level interface for inference.
- Unified interface for text-to-brain and brain-to-text.
- Container for brain retrieval or generation outputs.
- Container for text retrieval scores and metadata.
- Chainable wrapper around a top-k brain result table.
Data#
Fetches data from Hugging Face and loads it.
Fetching#
- Fetch NeuroVLM data from Hugging Face repositories.
- Alias to the `_load_*` functions in retrieval resources.
Embeddings#
Pre-computed latent vectors for text and neuroimages.
- Alias to the `_load_latent*` functions in retrieval resources.
Masker#
NIfTI masker used to resample and mask neuroimages.
- Masker alias.
Models#
Base models for the autoencoder, projection heads, and Specter. Pretrained models are returned by `load_model` or by calling `.from_pretrained` on the model classes.
- Autoencoder for neuro-vectors.
- Align latent tensors.
- Wrapper for the Specter model.
- Alias to the `.from_pretrained` methods of the model classes.
Loss Functions#
The pretrained models were trained with InfoNCELoss or MSELoss; additional options include FocalLoss and TruncatedLoss.
- Compute InfoNCE loss between image and text embeddings.
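NeuroVLM's actual `InfoNCELoss` signature is not shown above. As a rough, dependency-free sketch of the standard symmetric InfoNCE formulation (every name here is illustrative, not the module's API):

```python
import math

def info_nce(img_emb, txt_emb, temperature=0.07):
    """Symmetric InfoNCE over a batch of paired (image, text) embeddings.

    img_emb, txt_emb: lists of equal-length vectors; pair i is a positive,
    every other pairing in the batch serves as a negative.
    """
    def normalize(v):
        n = math.sqrt(sum(x * x for x in v))
        return [x / n for x in v]

    img = [normalize(v) for v in img_emb]
    txt = [normalize(v) for v in txt_emb]
    n = len(img)
    # cosine-similarity logits scaled by temperature
    logits = [[sum(a * b for a, b in zip(img[i], txt[j])) / temperature
               for j in range(n)] for i in range(n)]

    def xent_diag(mat):
        # mean cross-entropy with the matching pair (diagonal) as the target
        total = 0.0
        for i in range(n):
            m = max(mat[i])
            lse = m + math.log(sum(math.exp(x - m) for x in mat[i]))
            total += lse - mat[i][i]
        return total / n

    # average the image-to-text and text-to-image directions
    cols = [[logits[i][j] for i in range(n)] for j in range(n)]
    return 0.5 * (xent_diag(logits) + xent_diag(cols))
```

Perfectly aligned pairs drive the loss toward zero; shuffled pairs drive it up.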
Training#
Convenience wrapper for training: a standard PyTorch training loop.
- Training loop.
- Determine the device to move models and tensors to.
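The device helper typically boils down to a CUDA-then-MPS-then-CPU preference. A minimal sketch, with availability passed as plain booleans rather than the real `torch.cuda.is_available()` / `torch.backends.mps.is_available()` calls (the function name is hypothetical):

```python
def resolve_device(cuda_available: bool, mps_available: bool = False) -> str:
    """Pick the best available device string: CUDA, then Apple MPS, then CPU.

    In practice the flags would come from torch's availability checks;
    plain bools keep this sketch dependency-free.
    """
    if cuda_available:
        return "cuda"
    if mps_available:
        return "mps"
    return "cpu"
```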
Metrics#
Performance metrics.
- Compute Dice score.
- Compute Dice score of the top-k predictions.
- Elementwise Bernoulli negative log-likelihood (cross-entropy), in nats. Parameters: `y_true`, `(N, D)` floats in `[0, 1]`; `logits`, `(N, D)` raw logits from the decoder (before sigmoid).
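As a rough pure-Python sketch of what the listed metrics compute (function names hypothetical): Dice, top-k Dice, and the numerically stable elementwise Bernoulli NLL `max(z, 0) - z*y + log(1 + exp(-|z|))`:

```python
import math

def dice_score(pred, target, eps=1e-8):
    """Dice coefficient between two binary vectors (lists of 0/1)."""
    inter = sum(p * t for p, t in zip(pred, target))
    return (2.0 * inter + eps) / (sum(pred) + sum(target) + eps)

def dice_top_k(scores, target, k):
    """Dice score after binarizing `scores` by keeping its top-k entries."""
    top = set(sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)[:k])
    pred = [1 if i in top else 0 for i in range(len(scores))]
    return dice_score(pred, target)

def bernoulli_nll(y_true, logits):
    """Elementwise Bernoulli NLL in nats for (N, D) targets and raw logits.

    Uses the stable form max(z, 0) - z*y + log(1 + exp(-|z|)), which avoids
    overflow for large-magnitude logits.
    """
    return [[max(z, 0.0) - z * y + math.log1p(math.exp(-abs(z)))
             for y, z in zip(row_y, row_z)]
            for row_y, row_z in zip(y_true, logits)]
```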