About Me

Bangalore, India
I am an Oracle Certified Professional working at SAP Labs. Everything written on my blog has been tested in my local environment. Please test before implementing or running in production. You can contact me at amit.rath0708@gmail.com.

Friday, August 5, 2016

ORA-31623: a job is not attached to this session via the specified handle

Today I was doing an import and got the below error while the import was initializing :-

Issue :-

Import: Release - Production on Thu Aug 4 22:49:29 2016

Copyright (c) 1982, 2011, Oracle and/or its affiliates.  All rights reserved.

Username: /as sysdba

Connected to: Oracle Database 11g Enterprise Edition Release - 64bit Production
With the Partitioning, Automatic Storage Management, OLAP, Data Mining
and Real Application Testing options

UDI-31623: operation generated ORACLE error 31623
ORA-31623: a job is not attached to this session via the specified handle
ORA-06512: at "SYS.DBMS_DATAPUMP", line 3326
ORA-06512: at "SYS.DBMS_DATAPUMP", line 4551
ORA-06512: at line 1

Cause :-

The above issue can occur if any registry component is invalid, or if Oracle is unable to allocate sufficient memory to the Data Pump job.

Solution :-

Crosschecked DBA_REGISTRY; all components were in a VALID state.
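The registry check can be done with a simple query against the standard DBA_REGISTRY view, for example:

SQL> select comp_id, comp_name, status from dba_registry;

Any component with a status other than VALID should be investigated before looking at memory.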

Checked the SGA components allocated to the database and saw that only SGA_MAX_SIZE was set; SGA_TARGET was 0. PFB :-

Data Pump uses the Streams pool to allocate memory to its jobs. If the SGA is manually sized, allocate some memory to STREAMS_POOL_SIZE:

SQL> alter system set streams_pool_size=128M scope=both;

If ASMM/AMM is in use, perform the below steps :-

SQL> show parameter sga

NAME                                 TYPE        VALUE
------------------------------------ ----------- ------------------------------
lock_sga                             boolean     FALSE
pre_page_sga                         boolean     FALSE
sga_max_size                         big integer 1280M
sga_target                           big integer 0

Increased the value of SGA_MAX_SIZE to 4 GB (SGA_MAX_SIZE is a static parameter, so scope=spfile and an instance restart are required):

SQL> alter system set sga_max_size=4g scope=spfile;

System altered.

SQL> shu immediate
Database closed.
Database dismounted.
ORACLE instance shut down.
SQL> startup
ORACLE instance started.

Total System Global Area 4275781632 bytes
Fixed Size                  2260088 bytes
Variable Size            4043309960 bytes
Database Buffers          201326592 bytes
Redo Buffers               28884992 bytes
Database mounted.
Database opened.

SQL> alter system set sga_target=4g;

SQL> show parameter sga

NAME                                 TYPE        VALUE
------------------------------------ ----------- ------------------------------
lock_sga                             boolean     FALSE
pre_page_sga                         boolean     FALSE
sga_max_size                         big integer 4G
sga_target                           big integer 4G
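With SGA_TARGET set, ASMM can now grow the Streams pool on demand. As a quick sanity check (a sketch using the standard V$SGA_DYNAMIC_COMPONENTS view), the current Streams pool allocation can be verified with:

SQL> select component, current_size/1024/1024 as size_mb from v$sga_dynamic_components where component = 'streams pool';

A non-zero size here indicates memory is available for Data Pump to attach its job.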

Tried the import after this, and it completed without any errors.

I hope this article helped you.

Amit Rath


  1. great Stuff this helps me lot

  2. Very helpful. Thank you for this.

  3. This script saved my life! Thank you!

  4. This is very helpful. Thanks a lot.

  5. Excellent Post!, Thank you very much.

  6. Thanks for all your work posting; I've visited several times for your tips. I just wanted to add another cause for this problem. In my case we had no invalid objects, the Streams pool was not set to 0, etc. It was a production ERP database, so we couldn't just restart it. I'll point you toward the Oracle Support note at the end, but the short version is that this can also happen if a volume is unmounted, deleted (on storage), recreated, and remounted with the same name: it will leave stale file handles because it is using a cached mount. That was our problem.


  7. Hi,
    Thank you very much. Excellent Solution.

  8. why are we resizing streams_pool_size? Can you please explain?

    1. Thanks for writing in. Data Pump uses the STREAMS_POOL_SIZE parameter; if this value is not properly set, it allocates the needed memory from the SGA for the Data Pump job to execute, and if the SGA doesn't have memory to give to the Streams pool, the Data Pump job will fail.

      Let me know if I answered your question.

      Amit Rath

  9. Thanks Amit, I couldn't find a solution anywhere else and this helped me. I was doing an import into a PDB. What I notice though is that in the CDB$ROOT all dba_registry values are valid, however in the PDB CATPROC is invalid. I know one doesn't run catproc on a CDB but only catcdb so I was wondering if this is correct? My import worked even with catproc invalid though. Do you have any knowledge if this?

  10. Hello Sir, But it was working perfectly while streams_pool_size value = 0
