Are you struggling to export your data from Oracle? Does this step slow down your work? If your answer is yes, then you have landed on the right page. This blog walks you through the ways Oracle Data Pump can export data from your database, with screenshots where they help.

What is Oracle?


Oracle is a Relational Database Management System (RDBMS) developed by Oracle Corporation, a company founded on 16 June 1977. Oracle, often simply called Oracle DB, is one of the most widely used relational databases. Because it implements object-oriented features such as user-defined types, inheritance, and polymorphism, it is also called an Object-Relational Database Management System (ORDBMS).

Oracle was one of the first databases designed for data warehousing and enterprise grid computing. Enterprise grid computing makes Oracle a flexible and cost-effective database for managing data. Data in Oracle is accessed using SQL queries.

Oracle is available in several database editions, giving users the flexibility to choose an edition that matches their specific needs at a suitable cost.

Oracle tracks enterprise needs and keeps pace with technological developments, so its products are regularly updated with new features. The Oracle database is now also available on Oracle Cloud; this next-generation cloud is designed to run any application faster and more securely, with less investment.

What are the Key Features of Oracle?

  • Security: Oracle DB offers advanced security features such as Oracle Database Vault and Oracle Label Security, ensuring secure and private database services.
  • Availability: Maintains 24/7 availability through Oracle Data Guard and Real Application Clustering (RAC), enabling seamless failover.
  • Scalability and Performance: Features like Real Application Clustering and Portability ensure high scalability and top-notch performance, managing data consistency and concurrency effectively.
  • Portability: Oracle database can be ported across over 100 hardware platforms and 20 networking protocols, facilitating secure application development.
  • Backup and Recovery: Tools such as Oracle Recovery Manager (RMAN) give Oracle backup and recovery capabilities for data and processes in case of failures.
  • Analytics Solutions: Provides analytical solutions like OLAP and Oracle Advanced Analytics for robust business data analysis.

What is Data Pump?

Oracle offers its customers an export and import utility in the form of Data Pump, available in Oracle Database 10g and later. Data Pump provides high-speed movement of data and metadata from one database to another.

The Oracle Data Pump Export utility is invoked using expdp, and Oracle Data Pump Import is invoked using impdp. Your database, tablespace, schema, or table can be exported using Oracle Data Pump Export.
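As a quick, hypothetical illustration (the directory object dpump_dir1 and the hr user are assumptions; the user needs the appropriate privileges), a minimal invocation of each utility looks like this:

expdp hr DIRECTORY=dpump_dir1 DUMPFILE=hr.dmp SCHEMAS=hr
impdp hr DIRECTORY=dpump_dir1 DUMPFILE=hr.dmp SCHEMAS=hr

The first command unloads the hr schema into hr.dmp; the second loads that dump file back in. When the password is omitted, the utility prompts for it.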

Difference Between Data Pump and Export/Import Utility

The differences between Oracle Data Pump Export/Import and the traditional Export/Import utilities are listed below (example commands illustrating the contrast follow the list):

  • Data Pump is the Export and Import utility offered by Oracle; it is more flexible and faster than the traditional Export and Import utilities (exp/imp).
  • Data Pump operates on a dump file set (one or more files), whereas the traditional method writes a single dump file.
  • Data Pump offers better performance because expdp and impdp can use parallel execution during processing, whereas the traditional method is single-threaded.
  • Data Pump is a server-side utility (dump files are written on the database server), whereas the conventional method runs on the client side.
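To make the contrast concrete, here is a sketch (user, password, and file names are hypothetical) of a traditional client-side export next to a parallel Data Pump export of the same schema:

exp hr/hr_password FILE=hr.dmp OWNER=hr
expdp hr/hr_password DIRECTORY=dpump_dir1 DUMPFILE=hr%U.dmp PARALLEL=4 SCHEMAS=hr

The exp utility writes a single hr.dmp file on the client machine, while expdp writes a dump file set (hr01.dmp, hr02.dmp, and so on, via the %U substitution variable) on the database server, using up to four parallel workers.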

What Is Data Pump Export?

Data Pump Export (referred to in this section simply as Export) is a utility that unloads data and metadata into a set of operating system files known as a dump file set. Only the Data Pump Import utility can import the dump file set. The dump file set can either be imported on the same system or moved to another system and loaded there.

  • One or more disk files that contain table data, database object metadata, and control information make up the dump file set. The files are stored in a proprietary binary format. During an import operation, the Data Pump Import utility uses these files to locate each database object in the dump file set.
  • Because the dump files are written by the server rather than the client, the database administrator (DBA) must create directory objects that define the server locations to which the files are written.
  • Data Pump Export allows you to tell a job to move only a subset of the data and metadata, based on the export mode. Data filters and metadata filters, which are specified through Export parameters, are used to accomplish this.

What are Data Pump Export Modes?

Export provides different modes for unloading different parts of the database. The mode is specified using the appropriate parameter on the command line. The following sections go over the available modes; minimal example commands for each mode appear after the list:

  • “Full Export Mode”: The FULL parameter is used to specify a full export. The entire database is unloaded in a full database export. The DATAPUMP_EXP_FULL_DATABASE role is required for this mode.
  • “Schema Mode”: The SCHEMAS parameter is used to specify a schema export; this is the default export mode. If you have the DATAPUMP_EXP_FULL_DATABASE role, you can specify a list of schemas, optionally including the schema definitions themselves as well as system privilege grants.
    • If you do not have the DATAPUMP_EXP_FULL_DATABASE role, you can export only your own schema. The SYS schema cannot be used as a source schema for export jobs. Cross-schema references are not exported unless the referenced schema is also included in the list of schemas to export.
    • For example, a trigger defined on a table within one of the specified schemas, but residing in a schema that is not explicitly specified, is not exported. The same applies to external type definitions used by tables in the specified schemas: such type definitions must already exist in the target instance at import time.
  • “Table Mode”: The TABLES parameter is used to specify a table-mode export. Only a specified set of tables, partitions, and their dependent objects are unloaded in table mode. When the TRANSPORTABLE=ALWAYS parameter is combined with the TABLES parameter, only object metadata is unloaded; to move the actual data, you copy the data files to the target database.
    • As a result, export times are reduced. When moving data files between releases or platforms, Oracle Recovery Manager (RMAN) may be required to process the data files.
  • “Tablespace Mode”: The TABLESPACES parameter is used to specify a tablespace export. Only the tables in a specified set of tablespaces are unloaded in tablespace mode. When a table is unloaded, all of its dependent objects are unloaded as well.
    • Both the metadata and the data for the objects are unloaded. In tablespace mode, if any part of a table resides in the specified set, that table and all of its dependent objects are exported. Privileged users can export all tables; unprivileged users can export only the tables in their own schemas.
  • “Transportable Tablespace Mode”: The TRANSPORT_TABLESPACES parameter is used to specify a transportable tablespace export. Only the metadata for the tables (and their dependent objects) within a specified set of tablespaces is exported in this mode. The tablespace data files are copied in a separate operation.
    • A transportable tablespace import is then used to import the metadata dump file, specifying the data files to use. To use transportable tablespace mode, the specified tables must be completely self-contained; that is, the tablespace set must contain all storage segments of all tables (and their indexes).
    • If there are any self-containment violations, Export identifies all of the issues before the export proceeds. Once stopped, transportable tablespace exports cannot be restarted, and they cannot have a degree of parallelism greater than one. Encrypted columns are not supported in transportable tablespace mode.
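For orientation, here are minimal, hypothetical expdp invocations for each mode (the directory object dpump_dir1 and the user names are assumptions; appropriate privileges are required, and transportable tablespace mode additionally requires the tablespaces to be made read-only first):

expdp system FULL=YES DIRECTORY=dpump_dir1 DUMPFILE=full.dmp
expdp hr SCHEMAS=hr DIRECTORY=dpump_dir1 DUMPFILE=hr.dmp
expdp hr TABLES=employees,jobs DIRECTORY=dpump_dir1 DUMPFILE=tables.dmp
expdp system TABLESPACES=users DIRECTORY=dpump_dir1 DUMPFILE=ts.dmp
expdp system TRANSPORT_TABLESPACES=users DIRECTORY=dpump_dir1 DUMPFILE=tts.dmp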

Prerequisites

  • Oracle database version 10g or above. You can find the latest Oracle Database releases on Oracle's website.
  • Knowledge and experience in RDBMS.

Method 1: Oracle Data Pump Export in SQL Developer

Let’s discuss the Data Pump Export process in SQL Developer.

  1. Connect to your SYS Database
  2. Grant Access to your Database
  3. Create a Directory Object for your Database
  4. Open Data Pump Export Wizard of your Database
  5. Export your Database 

Step 1: Connect to your SYS Database

Connect as a privileged user such as SYSTEM or SYS.

Screenshot: Connecting to the SYS database

Step 2: Grant Access to your Database

Grant DBA access to the user that owns the database, schema, tablespace, or table you want to export by executing the following query in the worksheet.

GRANT DBA TO username;  
Screenshot: Granting DBA access to the user
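Note that DBA is a very broad role. If you prefer a narrower grant, the following alternative (a sketch, not a required step) is often sufficient for schema-level exports:

GRANT CREATE SESSION, DATAPUMP_EXP_FULL_DATABASE TO username;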

Step 3: Create a Directory Object for your Database

A directory object is a pointer to a directory on the database server's file system. Create one using the following command:

CREATE OR REPLACE DIRECTORY directory_object_name AS directory_location;
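For example (the object name and path here are hypothetical; adjust them to your environment):

CREATE OR REPLACE DIRECTORY export_dir AS 'C:\oracle\export';

The operating-system directory must already exist on the database server; Oracle does not create it for you.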

Step 4: Open Data Pump Export Wizard of your Database

In the menu bar, select View and click on the DBA option.

Screenshot: Opening the DBA panel from the View menu in SQL Developer

Select the new connection option in the DBA panel and then select the connection containing your desired data. Provide the correct credentials and open the database with your data.

Screenshot: Selecting a new connection in the DBA panel

In the DBA panel expand your connection and then select Data Pump. Right-click on Data Pump and select Data Pump Export Wizard. 

Screenshot: Data Pump Export Wizard in SQL Developer

Step 5: Export your Database 

In the Export Wizard, select the type of export you want to perform. Data Pump can export at the following levels:

  1. Database
  2. Tablespace
  3. Schema
  4. Table

Select Next and proceed.

Screenshot: Selecting the schema source

Select the schema you want to export, then click ‘>’ to include it in the export.

Screenshot: Selecting schemas to export

Similarly, select the names of the tables you want to export and click the down arrow. If you want to export all the tables in that schema, select the double down arrow, then click Next.

Screenshot: Choosing table data

In the Options step, select the correct directory object name in the Log File drop-down. You can also change the name of the log file.

Screenshot: Configuring options

In the Output Files step, select the correct directory object name in the Directories drop-down. You can also change the name of the dump file.

Screenshot: Choosing output files

Click Finish.

Screenshot: Reviewing the export summary

Method 2: Oracle Data Pump Export in SQL*Plus

If you are a SQL*Plus user, you can follow these steps for Oracle Data Pump Export.

  1. Create a Directory Object for your Directory
  2. Grant Permission to your User
  3. Export your Data

Step 1: Create a Directory Object for your Directory

A directory object is a pointer to a directory on the database server's file system. This step should be performed only by a privileged user, for example the SYS user.

In the command prompt, run the following command to connect to SQL*Plus as SYSDBA.

sqlplus / as sysdba
Screenshot: Connected to the Oracle database

Create a directory object using this command:

CREATE DIRECTORY directory_object_name AS location; 
Screenshot: Creating the export directory on the local system

Step 2: Grant Permission to your User

Grant read and write permissions on the directory to the user who will export the data.

GRANT READ, WRITE ON DIRECTORY directory_object_name TO username;
Screenshot: Granting read and write access on the directory

Grant the export privilege to the user using the following command.

GRANT DATAPUMP_EXP_FULL_DATABASE TO username;
Screenshot: Granting the export privilege

Step 3: Export your Data

Exit SQL*Plus and run the following expdp command from the command prompt.

expdp username/password DIRECTORY=directory_object_name DUMPFILE=dump_file_name.dmp LOGFILE=log_file_name.log SCHEMAS=schema_name
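For instance, filling in hypothetical names, the command might look like this:

expdp scott/tiger DIRECTORY=export_dir DUMPFILE=scott_schema.dmp LOGFILE=scott_schema.log SCHEMAS=scott

On success, scott_schema.dmp and scott_schema.log appear in the operating-system directory that export_dir points to.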

Hevo Data – An Efficient Alternative to Export Data

Hevo is a No-code Data Pipeline. Hevo helps you export data from Oracle to any destination such as a data warehouse. Hevo integrates data from 100+ sources and loads it into a database or a data warehouse. Hevo offers secure and reliable data transfer at a reasonable price.


Some of the salient features of Hevo are listed below:

  1. Easy Implementation: Hevo offers hassle-free setup and implementation in minutes. 
  2. Fully Automated: Manually loading your data from the source to the destination is a complex and time-consuming process. Hevo makes this process a cakewalk for its customers as it is a fully automated platform. 
  3. Zero Maintenance: Once you are done with the set-up, Hevo takes care of everything. 
  4. No Data Loss: Hevo offers a fault-tolerant algorithm, ensuring that there is no data loss.
  5. Secure: End-to-end encryption and two-factor authentication ensure that your data is secured. 

Experience all these amazing features of Hevo and sign up today for a 14-day free trial.

How to Filter Data during Export Operations?

Data Pump Export allows you to filter data and metadata to help you limit the type of information that is exported.

1. Data Filters:

The QUERY and SAMPLE parameters, which specify constraints on the table rows to be exported, are used to implement data-specific filtering.

Data filtering can also occur indirectly through metadata filtering, which can include or exclude table objects along with any associated row data.

Each data filter can be specified only once per table within a job. If filters with the same name are applied both to a particular table and to the whole job, the filter parameter supplied for the specific table takes precedence.
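As a sketch (dump file names are hypothetical), the two data filters look like this on the command line:

expdp hr DIRECTORY=dpump_dir1 DUMPFILE=hr_sample.dmp SCHEMAS=hr SAMPLE=70
expdp hr DIRECTORY=dpump_dir1 DUMPFILE=hr_query.dmp TABLES=employees QUERY=employees:"WHERE department_id = 10"

The first command exports a random sample of roughly 70 percent of each table's rows; the second exports only the employees rows that match the WHERE clause. Note that the quoting required for QUERY varies by operating system shell, which is one reason it is often placed in a parameter file instead.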

2. Metadata Filters:

  • The EXCLUDE and INCLUDE parameters are used to implement metadata filtering; the two parameters are mutually exclusive.
    Metadata filters identify a set of objects to include in or exclude from an Export or Import operation. You could, for example, request a full export without package specifications or package bodies.
  • Remember that dependent objects of an identified object are processed along with the identified object; keep this in mind so that filters behave as you expect and you get the results you want.
  • If a filter specifies that an index should be included in an operation, for example, statistics from that index will be included as well. If a filter excludes a table, the filter will also exclude the table’s indexes, constraints, grants, and triggers.
  • When multiple filters are specified for an object type, an implicit AND operation is used to combine them. In other words, objects related to the job must pass all of the filters applied to their object types.

The same metadata filter name can be specified multiple times within a job.
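For example (dump file names are hypothetical), excluding all indexes, or including only tables whose names start with EMP, might look like this:

expdp hr DIRECTORY=dpump_dir1 DUMPFILE=hr_noindex.dmp SCHEMAS=hr EXCLUDE=INDEX
expdp hr DIRECTORY=dpump_dir1 DUMPFILE=hr_emp.dmp SCHEMAS=hr INCLUDE=TABLE:"LIKE 'EMP%'"

As with QUERY, the quoting needed for the name clause depends on your shell, so these filters are frequently supplied in a parameter file.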

To see a list of valid object types, query the following views: DATABASE_EXPORT_OBJECTS in full mode, SCHEMA_EXPORT_OBJECTS in schema mode, and TABLE_EXPORT_OBJECTS in table and tablespace mode. The object types listed in the OBJECT_PATH column are valid. For instance, you could use the following query:

SQL> SELECT OBJECT_PATH, COMMENTS FROM SCHEMA_EXPORT_OBJECTS
  2  WHERE OBJECT_PATH LIKE '%GRANT' AND OBJECT_PATH NOT LIKE '%/%';

The following is an example of the query’s output:

OBJECT_PATH
--------------------------------------------------------------------------------
COMMENTS
--------------------------------------------------------------------------------
GRANT
Object grants on the selected tables
 
OBJECT_GRANT
Object grants on the selected tables
 
PROCDEPOBJ_GRANT
Grants on instance procedural objects
 
PROCOBJ_GRANT
Schema procedural object grants in the selected schemas
 
ROLE_GRANT
Role grants to users associated with the selected schemas
 
SYSTEM_GRANT
System privileges granted to users associated with the selected schemas

Examples of Using Data Pump Export

  • Performing a Table-Mode Export: The TABLES parameter is used to specify a table-mode export in this example. To export the employees and jobs tables from the human resources (hr) schema, run the following Data Pump export command:
expdp hr TABLES=employees,jobs DUMPFILE=dpump_dir1:table.dmp NOLOGFILE=YES

Because user hr is exporting tables in its own schema, it is not necessary to specify the schema name for the tables. The parameter NOLOGFILE=YES indicates that an Export log file will not be created for the operation.

  • Data-Only Unload of Selected Tables and Rows: This example shows the contents of a parameter file (exp.par) that you can use to perform a data-only unload of all tables in the human resources (hr) schema except the countries and regions tables. Only rows of the employees table whose department_id is not 50 are unloaded, and the rows are ordered by employee_id.
DIRECTORY=dpump_dir1
DUMPFILE=dataonly.dmp
CONTENT=DATA_ONLY
EXCLUDE=TABLE:"IN ('COUNTRIES', 'REGIONS')"
QUERY=employees:"WHERE department_id !=50 ORDER BY employee_id"
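You would then run the export by pointing expdp at the parameter file:

expdp hr PARFILE=exp.par

Using a parameter file also sidesteps operating-system quoting issues for the QUERY and EXCLUDE clauses.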
  • Estimating Disk Space Needed in a Table-Mode Export: This example demonstrates how to use the ESTIMATE_ONLY parameter to estimate the space a table-mode export will consume without actually performing the export. The default BLOCKS method is used to estimate the number of bytes required to export the data in the employees, departments, and locations tables of the human resources (hr) schema.
> expdp hr DIRECTORY=dpump_dir1 ESTIMATE_ONLY=YES TABLES=employees, 
departments, locations LOGFILE=estimate.log

The estimate is written to the log file and displayed on the client's standard output device. The estimate covers table row data only; it does not include metadata.

  • Performing a Schema-Mode Export: This example exports the hr schema in schema mode. In a schema-mode export, only objects belonging to the corresponding schemas are unloaded. Because schema mode is the default, the SCHEMAS parameter is not required on the command line unless you are specifying more than one schema or a schema other than your own.
> expdp hr DUMPFILE=dpump_dir1:expschema.dmp LOGFILE=dpump_dir1:expschema.log
  • Performing a Parallel Full Database Export: This example shows a full database export with up to three parallel worker processes.
> expdp hr FULL=YES DUMPFILE=dpump_dir1:full1%U.dmp, dpump_dir2:full2%U.dmp
FILESIZE=2G PARALLEL=3 LOGFILE=dpump_dir1:expfull.log JOB_NAME=expfull

Because this is a full database export, all data and metadata in the database will be exported. Dump files full101.dmp, full201.dmp, full102.dmp, and so on will be created in round-robin fashion in the directories pointed to by the dpump_dir1 and dpump_dir2 directory objects. For best performance, these directories should be on separate I/O channels.

Each file will be up to 2 gigabytes in size, as needed. Up to three files will be created initially, and more will be created if necessary. The job and the master table will be named expfull. The log file will be written as expfull.log in the dpump_dir1 directory.

  • Using Interactive Mode to Stop and Reattach to a Job: To begin this example, rerun the parallel full database export from the previous example. While the export is running, press Ctrl+C. This launches Data Pump Export's interactive command interface: logging to the terminal is suspended and the Export prompt appears. To stop the job, type the following command at the Export prompt:
Export> STOP_JOB=IMMEDIATE
Are you sure you wish to stop this job ([y]/n): y

The job is placed in a stopped state, and the client exits.

To reattach to the job you just stopped, type the following command:

> expdp hr ATTACH=EXPFULL

After the job status is displayed, you can issue the CONTINUE_CLIENT command to resume logging mode and restart the expfull job.

Export> CONTINUE_CLIENT

The job is restarted, and the client is informed of the processing status.
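Other standard commands at the interactive Export prompt include STATUS (to check job progress), PARALLEL (to change the degree of parallelism), and KILL_JOB (to abort and delete the job). For example:

Export> STATUS
Export> PARALLEL=5

See the Oracle Data Pump documentation for the full list of interactive commands.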

Conclusion

You have learned how to use Oracle Data Pump Export in two different ways. If you prefer writing commands for your export process, the SQL*Plus command-line route is the better fit, whereas if you would rather not write commands, working with SQL Developer's wizard will be more convenient.

If you want to migrate your data from Oracle to another database or a data warehouse then you should opt for this No-code Data Pipeline, Hevo. Hevo is a fully-automated data pipeline that requires no monitoring from your end.

Want to take Hevo for a spin? Sign up for a 14-day free trial and experience the feature-rich Hevo suite firsthand. Also check out our unbeatable pricing to choose the best plan for your organization.

Let us know about your experience with Oracle in the comments section below.

Satyam Agrawal
CX Engineer

Satyam brings over two years of adept troubleshooting and deliverable-oriented experience. His client-focused approach has enabled seamless data pipeline management for numerous SMEs and enterprises. Proficient in Hevo's ETL architecture and skilled in DBMS sources, he ensures smooth data movement for clients.
