
First Pipeline Walkthrough

Let’s create a sample pipeline that:

  1. Ingests data from an S3 bucket
  2. Cleanses and deduplicates the data
  3. Loads the output to Snowflake
  4. Sends a notification on success
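Conceptually, these four stages run in order, each consuming the previous stage's output. The skeleton below is purely illustrative Python, not Qualiz code; each function stands in for one of the tasks you'll add in Step 3.

```python
# Conceptual outline only -- not Qualiz's API.
# Each function corresponds to a task in the drag-and-drop builder (Step 3).

def ingest_from_s3():            # Task 1: copy raw files into a staging table
    ...

def cleanse(records):            # Task 2: apply cleansing rules
    ...

def deduplicate(records):        # Task 3: drop duplicates by email/phone
    ...

def load_to_snowflake(records):  # Task 4: write cleaned data to the destination
    ...

def notify():                    # Task 5: send the success email
    ...

def run_pipeline():
    records = ingest_from_s3()
    records = cleanse(records)
    records = deduplicate(records)
    load_to_snowflake(records)
    notify()
```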

Step 1: Create a Project

Go to Projects → Create New Project. Name it something like demo-etl.

Step 2: Define Your Job

Navigate to Manage Jobs and create a new job.

  • Choose the project you created in Step 1
  • Set the job name and description
  • Set a retry policy and SLA if needed; the sketch below shows the general idea
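The retry sketch below is illustrative only, not Qualiz's implementation: max_attempts and base_delay are hypothetical names, shown just to spell out what a retry policy with exponential backoff does.

```python
import time

def run_with_retries(task, max_attempts=3, base_delay=30):
    """Illustrative retry loop with exponential backoff (not Qualiz code)."""
    for attempt in range(1, max_attempts + 1):
        try:
            return task()
        except Exception:
            if attempt == max_attempts:
                raise  # out of retries: surface the failure so alerts/SLA checks can fire
            # Exponential backoff: wait 30s, then 60s, then 120s, ...
            time.sleep(base_delay * 2 ** (attempt - 1))
```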

Step 3: Add Tasks

Use the drag-and-drop builder:

  • Task 1: Data Ingestion (S3 to staging table)
  • Task 2: Data Cleansing (apply cleansing rules)
  • Task 3: Data Deduplication (based on email/phone)
  • Task 4: Load to Snowflake
  • Task 5: Send Email Notification

Configure each task using its UI panel.
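To make the middle of the pipeline concrete, the pandas sketch below shows roughly what the cleansing and deduplication tasks do. The column names (email, phone) and the specific cleansing rules are assumptions for illustration; in Qualiz you configure the equivalent logic in the task panels rather than writing this code.

```python
import pandas as pd

def cleanse_and_deduplicate(df: pd.DataFrame) -> pd.DataFrame:
    """Illustrative cleansing + dedup; column names are assumed."""
    df = df.copy()
    # Cleansing: normalize formatting so true duplicates actually match
    df["email"] = df["email"].str.strip().str.lower()
    df["phone"] = df["phone"].str.replace(r"\D", "", regex=True)  # keep digits only

    # Deduplication: keep the first record for each email/phone pair
    return df.drop_duplicates(subset=["email", "phone"], keep="first")

# Two records that differ only in formatting collapse to a single row
raw = pd.DataFrame({
    "email": ["Ana@Example.com ", "ana@example.com"],
    "phone": ["(555) 123-4567", "5551234567"],
})
print(cleanse_and_deduplicate(raw))
```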

Step 4: Test and Trigger

Click Run Job → Ad-hoc Trigger. Monitor logs and task execution.

Once the run succeeds, you'll see the output tables in your destination and the audit logs in the dashboard.
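The dashboard is the primary way to confirm the run, but you can also check the destination directly. The snippet below uses the standard snowflake-connector-python package; the connection details and the table name CONTACTS_CLEAN are placeholders, not values Qualiz creates for you.

```python
import snowflake.connector

# Placeholder connection details -- substitute your own account, credentials,
# database/schema, and the output table your job actually writes.
conn = snowflake.connector.connect(
    account="your_account",
    user="your_user",
    password="your_password",
    warehouse="your_warehouse",
    database="DEMO_ETL",
    schema="PUBLIC",
)
cur = conn.cursor()
try:
    cur.execute("SELECT COUNT(*) FROM CONTACTS_CLEAN")
    print("Rows loaded:", cur.fetchone()[0])
finally:
    cur.close()
    conn.close()
```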


🚀 You’ve built your first end-to-end pipeline with Qualiz!