ETL Tools News & Reviews
Featured ETL Tool: Intelligent Integration ETL
Modern Metadata-Based ETL that Simplifies, Automates & Accelerates

A Data Dictionary Manages ETL Functionality Without Coding

Design Wizards Quickly Generate the Data Dictionary From Any Data Source

Data Managed ETL Reduces Life Cycle Costs by 80%

Easy Deployment to Cloud or On-Premise

Supports Most ETL Use Cases: Streaming, Batch, REST, ORM, JDBC and Files

ETL Software Comparison

| Feature | Intelligent Integration (II) | Informatica | Microsoft SSIS | Ab Initio | AWS Glue |
| --- | --- | --- | --- | --- | --- |
| Data Dictionary | Yes | Additional Cost | No | Yes | Yes |
| Metadata Managed | Yes | No | No | Yes | Mappings Only |
| Dynamic Metadata-Schema | Yes | No | No | No | No |
| Near Codeless | Yes | No | No | Yes (Need Rules Defined) | No |
| Declarative ETL | Yes | No | No | No | No |
| Design Wizards | Yes | No | No | Yes (Need Rules Defined) | Yes (Mappings Only) |
| Json / Hierarchical Data | Yes | Mapping Only | Mapping Only | Mapping Only | Mapping Only |
| Streaming ETL | Yes | Additional Cost | No | Yes | No |
| Near Real Time ETL | Yes | Additional Cost | No | Yes | No |
| Job / Error Reporting | Yes | Yes | No | Yes | No |
| Platform | Cloud App (Java) | Proprietary Server | Proprietary Server | Proprietary Server & OS | Proprietary Cloud |
| Price | Free* | $$$ | $$ | $$$$ | $$ |

* Support likely needed

Synopsis

Intelligent Integration's value proposition for ETL is similar to what WordPress provides for creating a website. You install it as a cloud application, and setup is a matter of configuration rather than coding. Our design wizards guide you by extracting schema information to populate a data-dictionary-style repository. The data dictionary can then be enhanced with simple tags to implement all types of ETL functionality. An intelligent rules engine implements the ETL from the data dictionary and tags: the data dictionary simply and cleanly represents your business requirements, while the rules engine handles the complex technical aspects of ETL.
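
The wizard step described above can be pictured as schema inference over a sample record. The sketch below is illustrative only: the class name, method names, and type choices are assumptions, not Intelligent Integration's actual API.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical sketch of a design-wizard step: inspect a sample record and
// generate data-dictionary entries (column name -> inferred type).
public class DictionaryWizard {

    // Infer a simple SQL-ish type for one sample value.
    static String inferType(String value) {
        if (value == null || value.isEmpty()) return "VARCHAR";
        try { Long.parseLong(value); return "BIGINT"; } catch (NumberFormatException ignored) { }
        try { Double.parseDouble(value); return "DOUBLE"; } catch (NumberFormatException ignored) { }
        if (value.matches("\\d{4}-\\d{2}-\\d{2}")) return "DATE";
        return "VARCHAR";
    }

    // Build dictionary entries from one sample record (field -> raw value).
    static Map<String, String> generate(Map<String, String> sampleRecord) {
        Map<String, String> dictionary = new LinkedHashMap<>();
        sampleRecord.forEach((col, val) -> dictionary.put(col, inferType(val)));
        return dictionary;
    }

    public static void main(String[] args) {
        Map<String, String> sample = new LinkedHashMap<>();
        sample.put("customer_id", "1001");
        sample.put("signup_date", "2023-05-01");
        sample.put("email", "a@example.com");
        System.out.println(generate(sample));
    }
}
```

A real wizard would sample many records and read database catalogs, but the output shape is the same: metadata, not code.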

Declarative Programming Paradigm

The declarative programming paradigm is a popular methodology for simplifying and automating many kinds of technology. Its main goal is to separate an application's business requirements (what needs to be done) from its technical implementation (how it is done). Intelligent Integration applies that methodology to ETL: the data dictionary keeps simple ETL simple while remaining expandable to cover more complex data requirements.
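
The what/how split can be shown in miniature. In this hedged sketch, the "what" is a list of tags declared per column; the "how" is a generic engine with one implementation per tag. The tag names and dictionary shape are invented for illustration.

```java
import java.util.List;
import java.util.Map;

// Minimal sketch of declarative ETL: requirements live in data (tags per
// column); a generic rules engine supplies the implementation of each tag.
public class DeclarativeEngine {

    // Apply the declared tags for one column to one raw value.
    static String apply(List<String> tags, String value) {
        for (String tag : tags) {
            switch (tag) {
                case "trim":          value = value.trim(); break;
                case "upper":         value = value.toUpperCase(); break;
                case "default-empty": if (value.isEmpty()) value = "N/A"; break;
            }
        }
        return value;
    }

    public static void main(String[] args) {
        // Changing requirements means editing tags, not editing the engine.
        Map<String, List<String>> dictionary = Map.of(
            "country", List.of("trim", "upper"),
            "nickname", List.of("trim", "default-empty"));
        System.out.println(apply(dictionary.get("country"), "  usa "));   // USA
        System.out.println(apply(dictionary.get("nickname"), "   "));     // N/A
    }
}
```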

Data Managed ETL

Managing data is inherently more cost effective than managing custom program code. We maintain the data dictionary in a metadata database; this metadata combines the data model, the schema, and the ETL requirements. Our declarative ETL keeps the data dictionary simple, and we support ad hoc SQL for mass updates to the metadata. The metadata can also be managed as a master data management solution, which we call Master Schema Management. This allows application developers to stay synchronized with data integration. There are countless value propositions to data managed ETL.
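
A mass update is cheap precisely because the dictionary is data. The in-memory sketch below mimics what an ad hoc SQL UPDATE against the metadata tables would do; the column-naming convention and the "pii" tag are assumptions for illustration.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Sketch of a mass metadata update: add a 'pii' tag to every column whose
// name ends in "_email" -- one bulk edit instead of many code changes.
public class MetadataBulkUpdate {

    static Map<String, String> tagEmailColumns(Map<String, String> dictionary) {
        Map<String, String> updated = new LinkedHashMap<>(dictionary);
        updated.replaceAll((col, tags) ->
            col.endsWith("_email") ? tags + ",pii" : tags);
        return updated;
    }

    public static void main(String[] args) {
        Map<String, String> dict = new LinkedHashMap<>();
        dict.put("work_email", "trim");
        dict.put("age", "int");
        System.out.println(tagEmailColumns(dict));
    }
}
```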

Responsive Dynamic Application

Because everything is managed as metadata, you can choose between design-time ETL design wizards and a dynamically managed, rules-based ETL server; the two work well together. Start with your initial design, then, as new schema changes occur at the source, they can be added automatically to the data dictionary. Schema changes can also be applied to the destination.
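
The dynamic half of that story amounts to absorbing schema drift at runtime. This is a hedged sketch, not the product's mechanism: the conservative VARCHAR default and the dictionary shape are assumptions.

```java
import java.util.LinkedHashMap;
import java.util.LinkedHashSet;
import java.util.Map;
import java.util.Set;

// Sketch of dynamic schema handling: when a record arrives with fields the
// dictionary has not seen, add them automatically so downstream ETL picks
// them up without redesign.
public class SchemaDrift {

    // Registers unknown fields in the dictionary; returns the names added.
    static Set<String> absorb(Map<String, String> dictionary, Map<String, Object> record) {
        Set<String> added = new LinkedHashSet<>();
        for (String field : record.keySet()) {
            if (!dictionary.containsKey(field)) {
                dictionary.put(field, "VARCHAR"); // conservative default type
                added.add(field);
            }
        }
        return added;
    }

    public static void main(String[] args) {
        Map<String, String> dict = new LinkedHashMap<>();
        dict.put("id", "BIGINT");
        Map<String, Object> rec = new LinkedHashMap<>();
        rec.put("id", 7);
        rec.put("loyalty_tier", "gold"); // new source column
        System.out.println(absorb(dict, rec));
    }
}
```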

Java Platform

Our foundation is a Java platform with rules classes and a workflow that implement ETL. These are preconfigured; you choose the one that fits your use case. Examples include NoSQL to SQL, data warehousing, dimensional modeling, Salesforce, etc. The metadata completes the ETL configuration. The platform supplies a multithreaded ETL engine plus management of metadata, jobs/batches, schemas, errors, and cache. Our Java technology can leverage nearly any Java library to connect to almost any data source.
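
To make "multithreaded ETL engine" concrete, here is a toy staged pipeline: reader, transformer, and writer each run on their own thread, connected by bounded queues so a slow stage only backpressures its neighbor. This is an illustrative sketch of the pattern, not the product's actual engine.

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

// Toy read/transform/write pipeline with the stages isolated on threads.
public class StagedPipeline {
    private static final String EOF = "__EOF__"; // end-of-stream sentinel

    static List<String> run(List<String> source) {
        BlockingQueue<String> toTransform = new ArrayBlockingQueue<>(16);
        BlockingQueue<String> toWrite = new ArrayBlockingQueue<>(16);
        List<String> sink = Collections.synchronizedList(new ArrayList<>());

        ExecutorService pool = Executors.newFixedThreadPool(3);
        pool.submit(() -> {                          // reader stage
            for (String row : source) toTransform.put(row);
            toTransform.put(EOF);
            return null;
        });
        pool.submit(() -> {                          // transform stage
            for (String row; !(row = toTransform.take()).equals(EOF); )
                toWrite.put(row.toUpperCase());
            toWrite.put(EOF);
            return null;
        });
        pool.submit(() -> {                          // writer stage
            for (String row; !(row = toWrite.take()).equals(EOF); )
                sink.add(row);
            return null;
        });
        pool.shutdown();
        try {
            pool.awaitTermination(5, TimeUnit.SECONDS);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return sink;
    }

    public static void main(String[] args) {
        System.out.println(run(List.of("a", "b", "c"))); // [A, B, C]
    }
}
```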

Data Transforms

Lookups (SQL, REST or Cached)
Pivot/UnPivot
Split (Several Methods)
Merge or Union
Aggregates
Filters
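
As one example from the list above, a cached lookup resolves a code through a backend (in the product this could be SQL or REST) and memoizes the result so repeated keys skip the backend. Class and method names are assumptions for illustration.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

// Sketch of a cached lookup transform: resolve key -> value via a pluggable
// backend function, caching each result.
public class CachedLookup {
    private final Map<String, String> cache = new HashMap<>();
    private final Function<String, String> backend;
    int backendCalls = 0; // exposed so the caching effect is visible

    CachedLookup(Function<String, String> backend) {
        this.backend = backend;
    }

    String resolve(String key) {
        return cache.computeIfAbsent(key, k -> { backendCalls++; return backend.apply(k); });
    }

    public static void main(String[] args) {
        Map<String, String> table = Map.of("US", "United States", "DE", "Germany");
        CachedLookup lookup = new CachedLookup(table::get);
        System.out.println(lookup.resolve("US"));
        System.out.println(lookup.resolve("US")); // served from cache
        System.out.println(lookup.backendCalls);  // 1
    }
}
```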

Data Modeling

Primary Keys/Foreign Keys
Surrogate Key Management
Dimensional Modeling
Master Data Management Integration
External Data Integration
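
Surrogate key management from the list above boils down to: map each natural (business) key to a stable generated key, and reuse it when the same natural key arrives again. A minimal in-memory sketch, assuming a per-dimension generator; real implementations persist the mapping.

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of surrogate key management for one dimension table.
public class SurrogateKeys {
    private final Map<String, Long> assigned = new HashMap<>();
    private long next = 1;

    // Returns the existing surrogate for this natural key, or assigns one.
    long keyFor(String naturalKey) {
        return assigned.computeIfAbsent(naturalKey, k -> next++);
    }

    public static void main(String[] args) {
        SurrogateKeys dim = new SurrogateKeys();
        System.out.println(dim.keyFor("cust-ABC")); // 1
        System.out.println(dim.keyFor("cust-XYZ")); // 2
        System.out.println(dim.keyFor("cust-ABC")); // 1 again: stable key
    }
}
```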

Automatic Data Cleansing

Primitive Datatype Cleansing
Form Value Cleansing (Address, Email etc)
External Data Quality Integration
Irregular Data Error Logging
Master Schema Management
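
Primitive datatype cleansing with irregular-data error logging, as listed above, can be sketched as: coerce raw text to the declared type, and log values that do not parse instead of failing the whole load. The error-log shape here is an assumption.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.OptionalInt;

// Sketch of primitive datatype cleansing with error logging.
public class Cleanser {
    final List<String> errorLog = new ArrayList<>();

    // Coerce a raw string to an int; log and skip irregular values.
    OptionalInt toInt(String raw, String column) {
        try {
            return OptionalInt.of(Integer.parseInt(raw.trim()));
        } catch (NumberFormatException e) {
            errorLog.add(column + ": irregular value '" + raw + "'");
            return OptionalInt.empty();
        }
    }

    public static void main(String[] args) {
        Cleanser c = new Cleanser();
        System.out.println(c.toInt(" 42 ", "age")); // OptionalInt[42]
        System.out.println(c.toInt("n/a", "age"));  // OptionalInt.empty
        System.out.println(c.errorLog);
    }
}
```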

Data Warehouse/Big Data

Node/Partition Aware
Multithreaded
Write Buffering per Node/Partition
Isolation of Read/Transform/Write for Scalability
Integrate Hadoop Java Libraries

DBA & Operations

Standard Job Management
Restartable Jobs
Incremental/Full Load Job Integration
Destination Error Automatic Retry
Detailed Job Monitoring/Reporting

QA/Data Quality Tools

Record Count Reporting
Schema Compares
Data Compares Using Metadata
Detailed Error Logging/Reporting

Extensible

Compute Columns or Json Elements
User Defined Functions
Extensible Java Framework for ETL
Metadata Extendable

NoSQL Integration

Metadata Supports Hierarchical Data
Automatic Detection of Schema Changes
Automatic Normalization of Semistructured Data
Automatic Datatype Detection
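
Automatic normalization of semistructured data, as listed above, can be pictured as flattening a nested Json-like document into dotted column names a SQL destination can hold. The dotted-name convention is an assumption; real normalization also handles arrays as child tables.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Sketch of flattening a nested document into flat column names.
public class Normalizer {

    static Map<String, Object> flatten(String prefix, Map<String, Object> doc) {
        Map<String, Object> flat = new LinkedHashMap<>();
        for (Map.Entry<String, Object> e : doc.entrySet()) {
            String key = prefix.isEmpty() ? e.getKey() : prefix + "." + e.getKey();
            if (e.getValue() instanceof Map) {
                @SuppressWarnings("unchecked")
                Map<String, Object> nested = (Map<String, Object>) e.getValue();
                flat.putAll(flatten(key, nested)); // recurse into sub-documents
            } else {
                flat.put(key, e.getValue());
            }
        }
        return flat;
    }

    public static void main(String[] args) {
        Map<String, Object> doc = new LinkedHashMap<>();
        doc.put("id", 1);
        doc.put("address", Map.of("city", "Austin"));
        System.out.println(flatten("", doc)); // {id=1, address.city=Austin}
    }
}
```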

Government Compliance

Meet HIPAA, EU, GSA PII Requirements
Tracking/Audit of PII Data
Remove or Obfuscate PII Data
Data Dictionary Report for Documentation
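
The "remove or obfuscate PII" item above is commonly implemented by hashing tagged columns: a digest hides the value while keeping it joinable. In this sketch the set of PII columns and the SHA-256 choice are illustrative assumptions, not the product's documented behavior.

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.Set;

// Sketch of PII obfuscation: replace tagged columns with a SHA-256 digest.
public class PiiMasker {

    static String digest(String value) {
        try {
            byte[] h = MessageDigest.getInstance("SHA-256")
                    .digest(value.getBytes(StandardCharsets.UTF_8));
            StringBuilder hex = new StringBuilder();
            for (byte b : h) hex.append(String.format("%02x", b));
            return hex.toString();
        } catch (NoSuchAlgorithmException e) {
            throw new IllegalStateException(e); // SHA-256 is always available
        }
    }

    // Hash only the columns declared as PII; pass the rest through.
    static Map<String, String> mask(Map<String, String> row, Set<String> piiColumns) {
        Map<String, String> out = new LinkedHashMap<>();
        for (Map.Entry<String, String> e : row.entrySet())
            out.put(e.getKey(),
                    piiColumns.contains(e.getKey()) ? digest(e.getValue()) : e.getValue());
        return out;
    }

    public static void main(String[] args) {
        Map<String, String> row = new LinkedHashMap<>();
        row.put("email", "a@example.com");
        row.put("country", "US");
        System.out.println(mask(row, Set.of("email")));
    }
}
```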

Frequently Asked Questions

How can simple tags invoke "advanced" ETL?

If you gave 10 ETL programmers identical non-trivial requirements, you would get back 10 different ETL code packages. Intelligent Integration has simply prebuilt all of the technical functionality; the subjective programming decisions have been replaced with ETL best practices.

Is Intelligent Integration fast?

Some people assume metadata managed ETL must be slow. In fact, Intelligent Integration is very lightweight and fast. We are not an "interpretive" technology: after startup we run as 100% compiled Java code. We are memory and CPU efficient, using only lightweight Json documents during data processing, and Intelligent Integration is fully multithreaded.

What is Declarative ETL?

Our core innovation is that ETL can be implemented using the Declarative Programming Paradigm. We separated the technology of "how to implement ETL" from the data dictionary that states "what ETL to do". We also discovered that this approach allows the coordination of database schema with ETL.

Articles

Use Case: Load Elasticsearch ETL from SQL
Low-latency multi-table SQL data is pulled and merged into a hierarchical Json document in Elasticsearch. There is a strong[...]
Declarative ETL Cost Savings
Legacy ETL solutions are a procedural programming exercise. An ETL process starts with a source query pulling data, then through[...]
Top 15 reasons to use a Json ETL tool
Advantages of using declarative Json ETL software. Intelligent Integration is a powerful enterprise declarative ETL tool based on internal Json[...]
Use Case: Automatic ETL schema generation
Semistructured source data with automatic ETL schema generation at a SQL destination. Problem: A Software as a Service (SaaS) company stores OLTP[...]
Data Lake vs Integrated Data Warehouse ETL
Data Warehouse ETL vs Data Lake ETL. Intelligent Integration makes a fundamental change to the cost calculus between a Data[...]
