

Migrate your data quickly, easily and securely via a command-line interface

Watch Lift CLI Demo




High-speed data movement

Lift uses IBM Aspera under the covers to move your data to the cloud at blazing fast speeds.

Aspera's patented transport technology leverages existing WAN infrastructure and commodity hardware to achieve speeds that are hundreds of times faster than FTP and HTTP.



Quickly recover from common problems

Lift automatically recovers from common problems you may hit during a migration. For example, if your file upload is interrupted mid-transfer, Lift will resume where you left off. File uploads are stable and robust, even over the most bandwidth-constrained networks.



Encryption for data in motion

Nobody wants to end up on the front page of the news. Any data moved over the wire to the IBM Cloud is completely secure via a 256-bit encrypted connection.



No strings attached

We want you to try our cloud data services. Cost shouldn't be an issue.



Control each migration step

Every data migration is split into three steps: extract from source, transport over the wire, and load into target. Our CLI gives you the flexibility to perform these three steps separately so that your data migration works around your schedule, not the other way around.
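As a sketch, the three stages might be driven independently like this. All hostnames, table names, schemas, and properties files below are placeholders, and the helper echoes each command as a dry run rather than executing it:

```shell
# Dry-run sketch of the three migration stages as separate commands.
# stage() just prints each lift command; swap out "echo" to run for real.
stage() { echo "lift $*"; }

# 1. Extract a source table to a CSV file (e.g. during the day).
stage extract --source-schema ADMIN --source-table SALES \
  --source-host pda.example.com -pf source.properties --file SALES.csv
# 2. Transport the file to the cloud landing zone when bandwidth allows.
stage put --file SALES.csv -pf target.properties
# 3. Load it into the target engine during your maintenance window.
stage load --filename SALES.csv --target-schema MYSCHEMA \
  --target-table SALES --file-origin extract-pda -pf target.properties
```

Because each stage is its own command, you can run the extract overnight, transport during off-peak hours, and load whenever your warehouse has spare capacity.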


Built for the cloud

Always up-to-date

You'll install the Lift CLI only once on your on-premises machine. Under the covers, the CLI works with the Lift Core Services running in the IBM Cloud to help get your data to your Watson Data Platform persistent store. Like any other cloud app, Lift never requires an update. New features are instantly available to you without you having to lift a finger.


We're GA!

We're proud and excited to announce the general availability of the IBM Lift CLI. In a short amount of time, our users have migrated tens of terabytes to our premier cloud data warehouse service, Db2 Warehouse on Cloud. Meanwhile, our engineering team has been hard at work putting on the final touches and has even cranked out a couple of new features, including load with external tables, data subsetting on extract via SELECT/WHERE, HIPAA readiness, and preview support for the brand-new IBM Integrated Analytics System. Use the blue feedback strip to let us know how we're doing. Or, if you get stuck with Lift, go to the Community Help section to reach out to us via Stack Overflow.

Read Blog Post
See All Recent Changes

How It Works

Diagram: data flows from your source database (or CSV files) into the Landing Zone for Db2 Warehouse on Cloud, and from there into Db2 Warehouse on Cloud.

Use Cases

Want to migrate from IBM PureData System for Analytics to IBM Db2 Warehouse on Cloud?

It's a two-step process: convert your schema and migrate your data.

To convert your schema, start by downloading the IBM Database Conversion Workbench. The workbench will walk you through the process of converting your source database DDL so that it is compatible with the target. The workbench will also produce a report that tells you where action is required on your part. Once your schema is in place, you'll use the Lift CLI to migrate your data.

Get Data Conversion Workbench

You need to keep feeding your warehouse with new data constantly, and the Lift CLI is here to help.

Start by generating a set of CSV files that represent your incremental changes, per database table. Use the Lift CLI to scoop up those delimited files, push them over the wire, and import the files into IBM Db2 Warehouse on Cloud. Throw these steps in a script, set up a cron job, and you've got an ongoing incremental update of your data warehouse.
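A minimal sketch of that script, assuming one CSV per table named after the table and a properties file holding the target credentials. All directory, schema, and file names are hypothetical, and the lift commands are echoed as a dry run; remove the "echo" to execute for real:

```shell
#!/bin/sh
# Sketch of the incremental-update loop described above.
incremental_load() {
  csv_dir=$1     # directory where your exporter drops per-table CSVs
  schema=$2      # target schema in Db2 Warehouse on Cloud
  props=$3       # properties file with target-user/-password/-host
  for f in "$csv_dir"/*.csv; do
    [ -e "$f" ] || continue            # nothing new this run
    table=$(basename "$f" .csv)        # one CSV per table, named after it
    echo lift put --file "$f" -pf "$props"
    echo lift load --filename "$(basename "$f")" \
      --target-schema "$schema" --target-table "$table" \
      --header-row --remove --file-origin user -pf "$props"
  done
}
# Schedule via cron, e.g.:  0 2 * * *  /usr/local/bin/incremental_load.sh
```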

You can use the Lift CLI to migrate data from multiple databases or data sources into a single IBM Db2 Warehouse on Cloud MPP cluster. Lift provides the flexibility to take tables from multiple data sources and import them under a single schema in IBM Db2 Warehouse on Cloud so that you can decommission your existing database cluster.

Don't slam your transactional data store with reporting queries.

Your customers don't care that you need to run analytics on their buying behavior. They just want a snappy user experience.

Spin up a cloud data warehouse, such as IBM Db2 Warehouse on Cloud, to run analytics on data from your transactional data store. Keep your reports and dashboards up to date by sending small amounts of data from the source, and always have an up-to-date view of your business.


Supported sources: IBM Db2 Warehouse, IBM PureData System for Analytics, IBM Integrated Analytics System, or any data represented in a CSV file format.

Target: IBM Db2 Warehouse on Cloud.

Download Lift CLI

Minimum local system requirements
  • CPU - 1 Core
  • Memory - 300 MB
  • Disk - 250 MB for the Lift CLI. Additional disk space is recommended for data extraction. Refer to the FAQs for more guidance on disk space requirements for data extractions.
Supported database versions
  • IBM Db2 Warehouse on Cloud (Entry and Enterprise Edition) on IBM Cloud and Amazon Web Services (AWS)
  • IBM Db2 Warehouse ET enabled versions
  • PureData® System for Analytics Version 6.0.3 and above
  • IBM Integrated Analytics System
Supported operating systems
  • macOS 10.11 and above
  • Red Hat Enterprise Linux (RHEL) Server 6 and above
  • SUSE Linux Enterprise Server (SLES) 11 and above
  • Ubuntu 15.10 and above
  • Windows 7 and above
  • Windows Server 2012 and above (all editions)

Convert your schema with IBM Database Conversion Workbench

The IBM Database Conversion Workbench helps you migrate your source schema to IBM Db2 Warehouse on Cloud. It will examine your source DDL and automatically convert it to make the DDL compatible with your target engine. If the Database Conversion Workbench can't convert something automatically, you'll get a report detailing the steps you'll need to take to complete the conversion.




Migrate data to IBM Db2 Warehouse on Cloud in 5 minutes

Grab an instance of IBM Db2 Warehouse on Cloud

As a prerequisite, you'll need your very own instance of IBM Db2 Warehouse on Cloud. If you've purchased one of the enterprise plans, you're all set! If not, you can get your own IBM Db2 Warehouse on Cloud enterprise cluster or an entry instance.

Get and install the Lift CLI


Download the version of the Lift CLI for your operating system.


Unzip the package to a <zip-extract-directory> directory on your hard drive.


To install the Lift CLI, open a terminal window (macOS or Linux) or command prompt (Windows), and navigate to the <zip-extract-directory> directory. Then, run the install.

% <zip-extract-directory>/install <lift-home>

For example:

On Linux: $ sudo <zip-extract-directory>/install /opt/lift-cli

On macOS: % sudo <zip-extract-directory>/install /opt/lift-cli

On Windows: > <zip-extract-directory>\install.bat C:\lift-cli

The lift executable lives in <lift-home>/bin. Once the install completes, you can add <lift-home>/bin to your PATH environment variable. For the rest of this tutorial, we'll assume that <lift-home>/bin is in your PATH and that `lift` is accessible from your terminal.
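For example, assuming you installed to /opt/lift-cli as above (substitute your own <lift-home>):

```shell
# Put the Lift CLI on your PATH for this session.
LIFT_HOME=/opt/lift-cli
export PATH="$LIFT_HOME/bin:$PATH"
# To persist across sessions, append the export line to your shell
# profile, e.g. ~/.bashrc or ~/.zshrc on Linux/macOS.
```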

The tutorial covers two migration paths: CSV or flat files → IBM Db2 Warehouse on Cloud, and IBM PureData System for Analytics → IBM Db2 Warehouse on Cloud.

Prepare for migration


A sample data set is provided for this tutorial, though you're free to continue with your own data file and DDL. Download the Boston Property Assessment FY2016 (45.6MB) sample data set (courtesy of Analyze Boston). This package contains a schema (boston_property_assessment_fy2016.schema.sql) and a data file (BOSTON_PROPERTY_ASSESSMENT_FY2016.csv).


Log in to your IBM Db2 Warehouse on Cloud console.


To create a table, complete the following steps:


Copy the contents of boston_property_assessment_fy2016.schema.sql into the DDL box under the Run SQL tab.


Specify a schema by concatenating the schema name with the table name separated by a period. For example, <SCHEMA_NAME>.BOSTON_PROPERTY_ASSESSMENT_FY2016. If a schema is not specified, the table is created in your default schema. The default schema name is your user name in uppercase.


Click Run All. The result is a table called BOSTON_PROPERTY_ASSESSMENT_FY2016 in the specified or default schema.

Move your data


First, move the data file over to the IBM Db2 Warehouse on Cloud landing zone. You'll use this landing zone to stage your CSV file before it's ingested into IBM Db2 Warehouse on Cloud. You'll need your IBM Db2 Warehouse on Cloud for Analytics credentials. You can get these credentials from your IBM Db2 Warehouse on Cloud console by clicking Connect in the side navigation bar.

% lift put --file <path-to-csv-file>/BOSTON_PROPERTY_ASSESSMENT_FY2016.csv --target-user <database-user> --target-password <database-password> --target-host <database-hostname>

Alternatively, you can put these options, such as target-user and target-password, into a properties file and reference that file from the command using the -pf option.
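For illustration, a hypothetical properties file might look like this. We assume the keys mirror the long option names; check the Lift documentation for the exact format on your version:

```shell
# Create a properties file so credentials stay out of your shell history.
# Key names are assumed to mirror the long option names.
cat > db2wh.properties <<'EOF'
target-user=<database-user>
target-password=<database-password>
target-host=<database-hostname>
EOF
# The put command then shrinks to:
# lift put --file BOSTON_PROPERTY_ASSESSMENT_FY2016.csv -pf db2wh.properties
```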


Once the file is copied to the landing zone, load the data set into the IBM Db2 Warehouse on Cloud engine.

% lift load --filename BOSTON_PROPERTY_ASSESSMENT_FY2016.csv --target-schema <your-schema-name> --target-table BOSTON_PROPERTY_ASSESSMENT_FY2016 --header-row --remove --file-origin user --target-user <database-user> --target-password <database-password> --target-host <database-hostname>

Here, the --header-row option tells the loader that the first row of the data set contains the column headings, so that row is skipped during the load. The --file-origin user option denotes that this CSV file is user-generated and was not extracted using `lift extract`.


And you're done. You can now go back to the IBM Db2 Warehouse on Cloud console and run SQL queries against the sample data set.

Prepare for migration


First, you'll need to create the schema and table structure on your IBM Db2 Warehouse on Cloud target. You have several options to do this, but the most effective way is to download and use the IBM Database Conversion Workbench. This tool will help you convert your existing Netezza schema to one that's compatible with the IBM Db2 Warehouse on Cloud engine. Once the conversion is complete, the Database Conversion Workbench will produce a report to let you know which parts of your source DDL were automatically converted, and which parts require manual intervention. Check out the included step-by-step guide for more information.

Move your data


Once your table structure is in place, you can start moving your Netezza tables over to IBM Db2 Warehouse on Cloud. You'll start by extracting a table to a CSV file, then move that file over the wire, stage it in the landing zone on IBM Db2 Warehouse on Cloud, and finally load it into the engine.

First, extract the table to a CSV file.

% lift extract --source-schema <schema> --source-table <table> --source-database ADMIN --source-host <netezza-hostname> --source-user <netezza-user> --source-password <netezza-password> --source-database-port <netezza-port> --file <path-to-csv-file>

Alternatively, you can put these options, such as source-user and source-password, into a properties file and reference that file from the command using the -pf option.


Next, we'll transport the CSV file over to the IBM Db2 Warehouse on Cloud landing zone. For this, we'll use the `put` command.

% lift put --file <path-to-csv-file> --target-user <database-user> --target-password <database-password> --target-host <database-hostname>


And finally, we'll load your CSV file into the IBM Db2 Warehouse on Cloud engine.

% lift load --filename <csv-file> --target-schema <schema-name> --target-table <table-name> --file-origin extract-pda --target-user <database-user> --target-password <database-password> --target-host <database-hostname>

Here, the --file-origin extract-pda option is used to signify that the CSV file that's being loaded was extracted with Lift using the extract command.


And you're done. You can now go back to the IBM Db2 Warehouse on Cloud console and run SQL queries against the data set.


Go to Getting Started tutorial
Optional: Verify the download image