Data Pump


( Pro Edition )


The Data Pump is a tool used to transfer data from one database to another and can be reached via Tools | Data Pump. The Data Pump tool that comes with Database Workbench supports several types of connectivity:


Native connectivity for supported database engines (directly through their client libraries)
Open Database Connectivity (ODBC)
ActiveX Data Objects (ADO)


By using ODBC or ADO connectivity, you can transfer data to and from any other database engine.


In this topic

1. User Interface Overview
2. Loading & Saving a "pump project"
3. Table & Column Mappings
4. Constant Expressions
5. Where Clauses
6. Create new Source SQL
7. ODBC & ADO Notes
8. Options
9. Running Notes & Progress
10. Advanced Tips


See also:

Building an ADO Connection String
Data Pump Options in the Preferences


User Interface Overview

Below is a screenshot and explanation of different on-screen items:


Data Pump window


As you can see in the above example, we have connected to a Firebird source database (1) and to a NexusDB database called "ADGSample" (2). In both sections, the tables have been fetched (3 & 4) and a few table mappings have been made (4).



Back to Top


Loading & Saving a "pump project"

You can load and save your mappings and connection information to a "pump project"-file (extension ".pmp").


When you have saved a project, you can re-load it and the Data Pump will connect to both the source and destination database and display the table and column mappings. Any mappings made before loading the file will be removed.


If you want to use a particular set of mappings for a different set of databases, you can connect to the source and/or destination databases first and then load the file, the Data Pump will then ask you if you want to keep the current (source or destination) connection.


Keep Current Connection confirmation dialog



Back to Top


Table & Column Mappings

Such a mapping is easily created by dragging the source table onto the destination table and dropping it there. After such a drop, an attempt is made to match the column names. This matching process can be adjusted on the Options tab; it performs a case-insensitive match and filters out non-standard characters such as ' # $ _ etc.


Any columns that are not matched remain empty until you drag and drop a source column onto the destination column.


The transfer order can be controlled via the UP/DOWN arrow buttons on the Data Pump toolbar. This can be very useful if your target database has foreign key constraints defined. Alternatively, you can let Database Workbench determine the order automatically by examining foreign key dependencies; use the Determine Table Order button for that.


Determine Table Order button


You can also use the Auto-map button to automatically match table & column names for all available tables.


Auto-map button



Back to Top


Constant Expressions

Besides source database values, you can also use a constant or an expression that results in a valid value, written in valid SQL for that particular database engine. Right-click a column to enter such an expression.


Expression Example


In the above example, you can see that while transferring data, the destination column ACCOUNTS.ACT_ID will be filled by using a generator. Any expression that results in a valid SQL value is possible here.
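For example, on Firebird the expression could fetch the next value from a generator. A minimal sketch, assuming a generator named GEN_ACT_ID exists in the destination database:

```sql
-- Hypothetical expression for the ACCOUNTS.ACT_ID destination column;
-- GEN_ID fetches the next value from the (assumed) generator GEN_ACT_ID.
GEN_ID(GEN_ACT_ID, 1)
```

Any literal (such as 0 or 'UNKNOWN') or engine-specific function call (such as CURRENT_TIMESTAMP) works the same way.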



Back to Top


Where Clauses

For the source table, you can also add a custom SQL WHERE clause to filter the data going to the destination database. Right-click the source table to enter such a WHERE clause.


A WHERE Clause


The above WHERE clause causes only those ENDBALL records with a SUMS value above 10000 to be transferred.
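Based on that example, the entered clause is simply the condition itself (the ENDBALL table and SUMS column come from the screenshot above):

```sql
-- Filter entered for the ENDBALL source table:
-- only rows with SUMS above 10000 are transferred.
SUMS > 10000
```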



Back to Top


Create new Source SQL

You can also use a custom SQL statement as the source of data to transfer. Click the Create New Source SQL button in the toolbar and a simplified SQL Editor will be displayed.


Simplified SQL Editor (click to expand)
Simplified SQL Editor


In this SQL Editor, you can create and test your SQL statement. When you are satisfied with the results, close the editor to accept the changes. The SQL statement has to be named; you can enter a name for it before closing.
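As an illustration, a custom source statement can combine or convert columns before the transfer. A sketch, assuming a hypothetical CUSTOMER table with FIRST_NAME and LAST_NAME columns:

```sql
-- Hypothetical custom source SQL: combines two name columns
-- into one, ready to be mapped onto a single destination column.
SELECT
  CUST_ID,
  FIRST_NAME || ' ' || LAST_NAME AS FULL_NAME
FROM CUSTOMER
```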


To delete it, right-click it in the treeview and select Delete from the popup menu. Double-clicking it, or selecting Edit SQL from that menu, opens the editor again.


You can expand the SQL item just as you expand the tables and look at its columns.



Back to Top


ODBC Notes

Using an ODBC data source requires a registered ODBC Data Source Name (DSN). These can be created from the Windows Control Panel applet for ODBC data sources, which can also be reached via the 'Edit ODBC Datasources' button next to the ODBC Datasource drop-down box.


The Edit ODBC Datasources button


ADO Notes

An ADO connection requires a rather complex connection string. Clicking the 'Edit Connection String' button opens a dialog that lets you select from the available ADO drivers and configure the settings that will be passed to the ADO connection; see Building an ADO Connection String.
If you have tables & columns that use non-standard characters such as spaces, select the SQL syntax you want to use from the drop-down box.
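To give an idea of what such a string looks like, here is an illustrative example for SQL Server via OLE DB (the server and database names are placeholders, not values from this document):

```
Provider=SQLOLEDB;Data Source=MYSERVER;Initial Catalog=MyDatabase;Integrated Security=SSPI;
```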



Back to Top



Options

The Data Pump has options that influence the data transfer process, the name matching when mapping, and a few other things; their defaults are taken from the application Preferences.


Options section on the Data Pump


The options under Name Match are used when performing automatic matching only, that is, the automatic column matching when you drag a table from the source to the destination or when you choose to auto-match all tables.


The following options are available:

Records per single transaction

This sets how many records should be transferred before a database COMMIT is executed. Failed INSERT attempts do not count towards this limit. If you enter "0", all rows from the source will be transferred to the destination in a single transaction.


Transfer all data in a single transaction


This will wrap the transfer process of all source-to-destination items into a single transaction.

Do not COMMIT if an error occurred

Does not commit the transferred records if at least one record could not be inserted.


Empty tables before data-transfer

Empties the target tables before starting the transfer. This will empty all the tables that are participating in the data transfer process.


Disable indices/triggers/constraints

If the target database engine allows it, these objects can be disabled before starting the data transfer; they will be re-enabled afterwards.



Back to Top


Running Notes & Progress

By default, the Data Pump will open an additional connection to your databases and optimize the running process for concurrency - that is, you will be able to continue working with Database Workbench. If you don't mind a stalled Database Workbench instance, click the arrow next to the Start Data Transfer Process button and select Optimize for Speed from the menu.


During the data transfer process, the Process & Messages tab page is activated.


Process & Messages during and after the data transfer process


The left-hand side of that tab holds a list of the mapped data transfers (source to destination); the right-hand side holds process messages and error results on a per-transfer basis. Selecting [none] at the top left lists a summary of available messages; selecting a Transfer item lists the errors and messages for that particular transfer.


A Transfer item displayed in bold is currently being processed; one displayed in red had errors during processing.



Back to Top


Advanced Tips

1) If you need to convert data while transferring from one table to another, you can use a "custom SQL source" in a specific way.


Instead of a simple (or complex) SQL statement that returns results, you can also use a Stored Procedure that generates or converts data, or use functions in your SQL to, for example, trim the contents of columns before the transfer.
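As a sketch of the second idea, assuming a hypothetical PRODUCTS table with a padded NAME column:

```sql
-- Hypothetical custom source SQL: cleans up column
-- contents while the data is being read for transfer.
SELECT
  PROD_ID,
  TRIM(NAME) AS NAME
FROM PRODUCTS
```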


The possibilities are endless.


2) Although you can use an ORDER BY clause on each table or view, all rows will still be transferred unless you specify a WHERE clause; you cannot, for example, transfer only the "first 100 rows". Again, the "custom SQL source" makes this easy: use a SELECT with a clause that limits the number of rows.
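The exact syntax depends on the database engine; on Firebird, for example, such a source statement could look like this (using the ENDBALL table and SUMS column from the earlier example):

```sql
-- Transfers only the 100 rows with the highest SUMS value.
SELECT FIRST 100 *
FROM ENDBALL
ORDER BY SUMS DESC
```

Other engines use TOP (SQL Server) or LIMIT (MySQL, PostgreSQL) for the same purpose.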



Back to Top