Browse 31 websites in this category • Page 1 of 2
Sites where Data Pipeline Tool is a core feature are shown first, marked with a Core badge.
Logstash lets you collect, parse, and transform data from any source, helping you streamline log management and data integration with ease.
Apache Airflow lets you build, schedule, and monitor workflows. Easily automate complex processes and manage data pipelines at scale.
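Airflow's core abstraction is a DAG of tasks run in dependency order. The following is a hedged, stdlib-only sketch of that idea using Python's `graphlib`, not the actual Airflow API; the three task names are hypothetical:

```python
from graphlib import TopologicalSorter

# Hypothetical three-step pipeline: extract -> transform -> load.
# Each key is a task; its value is the set of tasks it depends on.
dag = {
    "extract": set(),
    "transform": {"extract"},
    "load": {"transform"},
}

def run(dag):
    """Execute tasks in an order that respects their dependencies."""
    order = list(TopologicalSorter(dag).static_order())
    for task in order:
        print(f"running {task}")  # a real orchestrator would invoke the task here
    return order

run(dag)  # extract, then transform, then load
```

A real Airflow DAG adds scheduling, retries, and monitoring on top of this ordering, but the dependency-resolution core is the same shape.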
Apache Flume helps you collect, aggregate, and move large amounts of log data reliably with a flexible, fault-tolerant streaming architecture.
Apache NiFi lets you automate, process, and move data between systems with an easy-to-use interface for building secure and reliable data pipelines.
Move data from over 140 sources to your database or warehouse in minutes with Stitch—no coding needed, fully automated, and cloud-based.
Luigi is a Python toolkit for building and managing complex batch pipelines, offering workflow automation, dependency handling, and clear documentation.
Apache Tez is an open-source framework for building complex data processing workflows on Hadoop, enabling efficient and flexible data pipelines.
Astronomer is a cloud platform for building, running, and monitoring data pipelines with Apache Airflow, making data workflows simple and reliable.
Apache Storm lets you process unbounded streams of data in real time. It's open source, supports any programming language, and is free to use.
Prefect helps you automate, monitor, and scale Python data workflows with easy orchestration, dynamic scaling, and built-in observability tools.
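Much of what orchestrators like Prefect add is resilience layered over plain Python functions. Below is a stdlib-only sketch of one such feature, automatic retries; it is an illustration of the pattern, not Prefect's actual API, and `flaky_fetch` is a made-up task:

```python
import functools
import time

def retry(attempts=3, delay=0.0):
    """Re-run a failing task up to `attempts` times before giving up."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            for i in range(attempts):
                try:
                    return func(*args, **kwargs)
                except Exception:
                    if i == attempts - 1:
                        raise  # out of attempts: surface the error
                    time.sleep(delay)
        return wrapper
    return decorator

calls = {"n": 0}

@retry(attempts=3)
def flaky_fetch():
    """Hypothetical task that fails twice, then succeeds."""
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient failure")
    return "data"

print(flaky_fetch())  # succeeds on the third attempt
```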
Dagster helps data engineers build, run, and manage data pipelines with modern orchestration tools for reliable and scalable data platforms.
PGSync lets you sync data from Postgres to Elasticsearch or OpenSearch, making it easy to keep your databases connected and up to date.
Integrate.io lets you build and manage low-code data pipelines to unify, transform, and sync data across sources for analytics and business insights.
Snowplow helps organizations collect, manage, and use customer behavioral data to power AI, analytics, marketing, and digital experiences.
Ab Initio offers powerful tools for building, managing, and integrating data pipelines, helping businesses process and analyze data efficiently.
Vector lets you collect, process, and route observability data quickly and easily. Build flexible data pipelines for logs and metrics across any platform.
Apache Beam lets you build and run unified data processing pipelines for both batch and streaming data, supporting multiple programming languages and cloud platforms.
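Beam's central idea is writing one transform that runs unchanged over bounded (batch) and unbounded (streaming) inputs. A stdlib-only illustration of that unification, not actual Beam code:

```python
def word_lengths(records):
    """One pipeline definition that accepts any iterable of strings,
    whether a bounded list (batch) or a generator (stream)."""
    for line in records:
        for word in line.split():
            yield len(word)

# Batch: a bounded, in-memory collection.
batch = ["hello world", "apache beam"]
print(list(word_lengths(batch)))  # [5, 5, 6, 4]

# "Stream": the same transform consuming a generator.
def stream():
    yield "streaming input"

print(list(word_lengths(stream())))  # [9, 5]
```

In Beam proper the same unification is expressed through `PCollection`s and runners, which also handle windowing and distribution; the sketch only shows the batch/stream symmetry.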