Webhook Job Types
This page is a comprehensive reference for all job types that can be monitored through webhook subscriptions in the Narrative platform.
Overview
Job types represent different categories of work performed by the Narrative platform. When creating webhook subscriptions, you can specify which job types to monitor to receive notifications for relevant events.
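For example, creating a subscription limited to a few job types might look like the sketch below. The base URL, endpoint path, payload field names, and authentication header are illustrative assumptions, not the documented Narrative API.

```python
import requests

# Hypothetical example: the base URL, endpoint path, payload field names, and
# auth header are assumptions for illustration, not the documented API.
API_BASE = "https://api.narrative.io"  # assumed base URL
subscription = {
    "url": "https://example.com/webhooks/narrative",  # your receiver endpoint
    "job_types": ["materialize-view", "datasets_deliver_data"],  # types to monitor
}

resp = requests.post(
    f"{API_BASE}/webhooks",  # assumed endpoint path
    json=subscription,
    headers={"Authorization": "Bearer <YOUR_API_TOKEN>"},
    timeout=30,
)
resp.raise_for_status()
print(resp.json())
```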
Available Job Types
Data Processing Jobs
materialize-view
- Description: Materialized view creation and refresh jobs triggered by dataset creation, updates, or scheduled refreshes
- Common use case: Monitor when your datasets are updated and ready for use
forecast
- Description: Legacy forecast jobs that approximate row counts for query definitions
- Common use case: Track completion of legacy row-count estimates for query definitions
nql-forecast
- Description: NQL-based forecast jobs that estimate rows matching NQL query definitions
- Common use case: Monitor NQL query planning and estimation jobs
costs
- Description: Cost calculation jobs triggered by data processing estimation requests
- Common use case: Track completion of cost calculations for budgeting
Dataset Management Jobs
datasets_sample
- Description: Dataset sampling jobs triggered when requesting preview samples of datasets
- Common use case: Monitor when dataset samples are ready for review
datasets_calculate_column_stats
- Description: Column statistics calculation jobs triggered by data profiling requests
- Common use case: Track completion of data analysis
datasets_delete_table
- Description: Dataset table deletion jobs triggered when cleaning up data resources
- Common use case: Confirm successful cleanup of data resources
datasets_suggest_mappings
- Description: Mapping suggestion jobs triggered by automated data integration processes
- Common use case: Monitor completion of data mapping recommendations
datasets_deliver_data
- Description: Data delivery jobs triggered when connectors deliver data to external endpoints
- Common use case: Track when your data has been successfully delivered to external platforms
delete-snowflake-table
- Description: Snowflake table deletion jobs triggered when cleaning up Snowflake resources
- Common use case: Confirm successful deletion of Snowflake tables
upload-stats
- Description: Statistics upload jobs triggered when synchronizing statistical data and metrics
- Common use case: Monitor completion of statistics synchronization
Model and AI Jobs
model_training_run
- Description: LLM model training jobs triggered when starting fine-tuned training processes
- Common use case: Monitor long-running ML model training jobs
models_deliver_model
- Description: Model delivery jobs triggered when deploying trained models to production data planes
- Common use case: Track model deployment completion
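Since a single webhook endpoint typically receives events for several of the job types above, a receiver often dispatches on the job type. The sketch below assumes the event payload carries job_type and id fields; those names are assumptions, not documented fields.

```python
# Sketch of dispatching incoming webhook events by job type. The "job_type"
# and "id" field names are assumptions about the event payload.
def on_dataset_delivered(event: dict) -> None:
    print(f"Data delivered for job {event['id']}")

def on_model_trained(event: dict) -> None:
    print(f"Model training finished for job {event['id']}")

HANDLERS = {
    "datasets_deliver_data": on_dataset_delivered,
    "model_training_run": on_model_trained,
}

def handle_event(event: dict) -> None:
    handler = HANDLERS.get(event.get("job_type"))
    if handler is None:
        return  # ignore job types this receiver does not subscribe to
    handler(event)

# Example: handle_event({"job_type": "datasets_deliver_data", "id": "job-123"})
```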
Job Input Parameters
Input parameters vary by job type. Common parameters include:
Dataset Jobs (materialize-view, datasets_*)
- dataset_id: ID of the dataset being processed
- dataset_name: Name of the dataset
- nql: NQL query being executed
- first_run: true for the first execution of a dataset's job, false for subsequent runs
- create_as_view: true to create a Snowflake view instead of a table/materialized view
- stats_enabled: Whether statistics collection is enabled
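Put together, the input block of a materialize-view job could look like the sketch below. The field names are the ones documented above; the values and the surrounding structure are hypothetical.

```python
# Example input parameters for a materialize-view job. The field names are the
# ones documented above; the values and overall structure are hypothetical.
job_input = {
    "dataset_id": "ds_12345",                          # dataset being processed
    "dataset_name": "retail_transactions",             # hypothetical dataset name
    "nql": "SELECT * FROM company_data.transactions",  # hypothetical NQL query
    "first_run": True,        # first execution of this dataset's job
    "create_as_view": False,  # materialize a table rather than a Snowflake view
    "stats_enabled": True,    # collect column statistics
}
```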
Forecast Jobs (forecast, nql-forecast, costs)
- Query or NQL definition being forecasted
- Parameters for estimation accuracy
- Target data sources
Model Jobs (model_training_run, models_deliver_model)
- Model configuration parameters
- Training data references
- Target deployment environment
Job States
All job types progress through these states:
- pending: Job has been queued but not yet started
- running: Job is currently executing
- completed: Job has finished successfully
- pending_cancellation: Job is being cancelled
- cancelled: Job was cancelled before completion
- failed: Job encountered an error and could not complete
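Most receivers only need to act on the terminal states (completed, cancelled, failed). The sketch below assumes the payload exposes the state in a status field and the job type in a job_type field; both names are assumptions.

```python
# Sketch of reacting to job state changes. The "status" and "job_type" field
# names are assumptions about the webhook payload.
TERMINAL_STATES = {"completed", "cancelled", "failed"}

def on_status_change(event: dict) -> None:
    status = event.get("status")
    if status not in TERMINAL_STATES:
        return  # pending / running / pending_cancellation: still in flight
    job_type = event.get("job_type")
    if status == "completed":
        print(f"{job_type} finished successfully")
    elif status == "failed":
        print(f"{job_type} failed; inspect the job for error details")
    else:  # cancelled
        print(f"{job_type} was cancelled before completion")
```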
Next Steps
App Events Reference
Next, explore the connector events you can monitor, from Facebook to Yahoo! and other platform integrations.