Oracle Error on creating new Batch Job [BATCH-2475] #1127

Closed
spring-projects-issues opened this issue Feb 18, 2016 · 31 comments


@spring-projects-issues

Peter Schäfer opened BATCH-2475 and commented

When launching a Batch Job, the following exception is thrown:
2016-02-18 03:20:01.686 ERROR 16635 --- [nio-8082-exec-6] o.a.c.c.C.[.[.[/].[dispatcherServlet] : Servlet.service() for servlet [dispatcherServlet] in context with path [] threw exception [Request processing failed; nested exception is org.springframework.dao.CannotSerializeTransactionException: PreparedStatementCallback; SQL [INSERT into BATCH_JOB_INSTANCE(JOB_INSTANCE_ID, JOB_NAME, JOB_KEY, VERSION) values (?, ?, ?, ?)]; ORA-08177: can't serialize access for this transaction
; nested exception is java.sql.SQLException: ORA-08177: can't serialize access for this transaction
] with root cause

java.sql.SQLException: ORA-08177: can't serialize access for this transaction

See attached stack trace.

The error is hard to reproduce.
We tried the following work-around:
<batch:job-repository id="jobRepository"
...isolation-level-for-create="READ_COMMITTED"
... />

with little success.


Affects: 3.0.6

Reference URL: https://stackoverflow.com/questions/22364432/spring-batch-ora-08177-cant-serialize-access-for-this-transaction-when-running

Attachments:

6 votes, 11 watchers

@spring-projects-issues

Peter Schäfer commented

Hello Guys,

would you please respond to this bug report?
I'm pretty sure I'm not the only one having this problem...

Best, Peter

@spring-projects-issues

Michael Minella commented

What additional information can you provide about the use case? I see jobs are being launched from servlets. What more can you tell me about the app and how transactions are handled within the app?

@spring-projects-issues

Peter Schäfer commented

Spring Batch is running as a standalone application.
Jobs are launched through Spring Batch Admin,
or by http requests like this:

curl -Ld 'jobParameters=...' http://...:8080/jobs/SomeJob.json

Transactions are handled in a standard fashion, with @Transactional method annotations.
However, the error occurs before the job is launched.

Here are a few excerpts from our configuration file:

<aop:aspectj-autoproxy proxy-target-class="true" />
<context:annotation-config />

...
<bean id="dataSource"
      class="org.springframework.jdbc.datasource.DriverManagerDataSource">
    <property name="driverClassName" value="..." />
    <property name="url" value="..." />
    <property name="username" value="..." />
    <property name="password" value="..." />
</bean>

<bean id="entityManagerFactory"
      class="org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean"
      p:dataSource-ref="dataSource" p:persistenceUnitName="...">
    <property name="jpaVendorAdapter">
        <bean p:generateDdl="false" p:showSql="false"
              class="org.springframework.orm.jpa.vendor.HibernateJpaVendorAdapter" />
    </property>
    <property name="persistenceXmlLocation" value="classpath:/META-INF/batch/persistence.xml" />
</bean>
...
<bean id="transactionManager" class="org.springframework.orm.jpa.JpaTransactionManager"
      p:entityManagerFactory-ref="entityManagerFactory" />
<tx:annotation-driven proxy-target-class="true" />
...

<batch:job-repository
    id="jobRepository"
    data-source="dataSource"
    transaction-manager="transactionManager"
    isolation-level-for-create="READ_COMMITTED"
    table-prefix="BATCH_" />

@spring-projects-issues

Michael Minella commented

Where are you using @Transactional? It's known not to play well with batch jobs.

@spring-projects-issues

Peter Schäfer commented

Usually with ItemReaders and ItemWriters, like:

@Override
@Transactional
public void open(Serializable checkpoint) { ...

@Override
@Transactional
public void writeItems(List<Object> items) throws Exception { ...

@Override
@Transactional
public void close() throws Exception { ...

What would be a better way of handling transactions?

@spring-projects-issues

Michael Minella commented

Spring Batch handles the transactions automatically so you don't need to do anything. Using @Transactional on batch artifacts like that interferes with our transaction handling. Please remove them and see if that addresses your issue.
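To illustrate what framework-managed transactions look like, here is a hypothetical sketch of a chunk-oriented step using the step builder API (the bean names, item types, and chunk size are assumptions for illustration, not taken from the reporter's application):

```java
// Sketch: Spring Batch begins and commits a transaction around each chunk itself,
// so neither the reader nor the writer needs @Transactional.
@Bean
public Step exampleStep(StepBuilderFactory steps,
                        ItemReader<String> reader,
                        ItemWriter<String> writer) {
    return steps.get("exampleStep")
            .<String, String>chunk(100) // one transaction per 100 items
            .reader(reader)
            .writer(writer)
            .build();
}
```

Any @Transactional proxy wrapped around the reader or writer would open its own transaction inside (or around) the chunk transaction, which is exactly the interference described above.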

@spring-projects-issues

Peter Schäfer commented

I'll give it a try ...

Thanks.

@spring-projects-issues

Peter Schäfer commented

I tried removing all @Transactional annotations.
The problem still persists...

The error is easy to reproduce on a newly set-up repository
(with an Oracle 12 database).

Some people recommend modifying the "initrans" settings; no success, however.

@spring-projects-issues

Peter Schäfer commented

None of the recommended work-arounds works reliably.

The problem can be reproduced on newly created database schemas,
and it somehow disappears after a number of unsuccessful attempts,
which is not very satisfactory :-(

Any more ideas?

@spring-projects-issues

Sebastian Droeppelmann commented

We circumvented the error by using two datasources: one for the Spring Batch metadata commits, one for the actual batch process. Then the error went away. It has something to do with the Oracle serialization (who would have guessed :P). I found more accounts of this problem on Stack Overflow and also encountered it myself. The approach mentioned above was the only reliable way to fix it.
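A minimal sketch of that two-datasource workaround, in the XML style used earlier in the thread (the connection settings and bean names here are placeholders and assumptions, not the actual configuration from this report):

```xml
<!-- Hypothetical: a datasource dedicated to the Spring Batch metadata tables -->
<bean id="batchDataSource"
      class="org.springframework.jdbc.datasource.DriverManagerDataSource">
    <property name="driverClassName" value="oracle.jdbc.OracleDriver" />
    <property name="url" value="jdbc:oracle:thin:@//dbhost:1521/service" />
    <property name="username" value="batch_user" />
    <property name="password" value="..." />
</bean>

<!-- A transaction manager over the dedicated datasource, so repository
     commits are decoupled from the business transactions -->
<bean id="batchTransactionManager"
      class="org.springframework.jdbc.datasource.DataSourceTransactionManager">
    <property name="dataSource" ref="batchDataSource" />
</bean>

<batch:job-repository id="jobRepository"
    data-source="batchDataSource"
    transaction-manager="batchTransactionManager"
    table-prefix="BATCH_" />
```

The key point of the workaround is that the job repository's metadata writes no longer share a connection pool or transaction manager with the business data access.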

@spring-projects-issues

Mz commented

@sdroeppel I tried this solution and it only works intermittently. Do I just need to change the datasource for the jobRepository, or do I need to change it for the jobExplorer too? Does it need a separate transaction manager as well?

<bean id="jobRepository" class="org.springframework.batch.core.repository.support.JobRepositoryFactoryBean">
    <property name="databaseType" value="ORACLE"/>
    <property name="dataSource" ref="dataSource" />
    <property name="transactionManager" ref="transactionManager" />
    <property name="isolationLevelForCreate" value="ISOLATION_READ_UNCOMMITTED"/>
</bean>

@spring-projects-issues

Kun liu commented

We are bugged by the same error at the moment. Any updates?

@spring-projects-issues

Yavdhesh commented

I am stuck with the same problem

 

@Bean
public JobRepository createJobRepository() throws Exception {
	LOG.info("Creating the JobRepository");
	LOG.info("Job Repository being created and isolationLevelForCreate is set to = " + isolationLevelForCreate);
	JobRepositoryFactoryBean factory = new JobRepositoryFactoryBean();
	factory.setDataSource(primaryDataSource());
	factory.setTransactionManager(transactionManager());
	LOG.info("Transaction manager object: " + transactionManager());
	factory.setIsolationLevelForCreate(isolationLevelForCreate);
	factory.setMaxVarCharLength(1000);
	factory.setDatabaseType("ORACLE");
	factory.afterPropertiesSet(); // required when building the factory bean manually
	return factory.getObject();
}

@spring-projects-issues

Raj Kumar Gupta commented

I have got a workaround for this issue.

Follow the steps below:

  1. Manually create the tables in your database using the schema creation script.
  2. Insert some dummy records into the BATCH_JOB_INSTANCE, BATCH_JOB_EXECUTION and BATCH_JOB_EXECUTION_PARAMS tables (don't forget to commit).
  3. Error solved. Enjoy.

@fmbenhassine fmbenhassine added in: documentation and removed in: core status: waiting-for-triage Issues that we did not analyse yet labels Jan 31, 2020
@fmbenhassine fmbenhassine added this to the 4.3.0 milestone Jan 31, 2020
@fmbenhassine

This has been asked several times on SO:

And each time, people reported that changing the default isolation level (which is ISOLATION_SERIALIZABLE) to a less aggressive value like ISOLATION_READ_COMMITTED or ISOLATION_READ_UNCOMMITTED resolved the issue. This is explained in detail in the reference docs here: https://docs.spring.io/spring-batch/docs/current/reference/html/job.html#txConfigForJobRepository

That said, according to the feedback on SO, other parameters (like the initrans value, the max pool size, etc) may influence the use case.

So I'm closing this issue for now until we are able to reproduce the problem in a reliable way with all parameters included.

@fmbenhassine fmbenhassine removed this from the 4.3.0 milestone Jan 31, 2020
@johnlabarge

We have the same issue using ISOLATION_READ_COMMITTED, and it only fails on the first integration test. That seems to indicate that the dummy-data solution would work. I wonder why, though.

@P-Hartford

VERY IMPORTANT NOTE: Oracle DOES NOT support ISOLATION_READ_UNCOMMITTED, yet this limitation is not mentioned anywhere in the Spring documentation.

I have had this issue sporadically. To the best of my deduction, somewhere within the default configuration that Spring starts up as soon as it detects the @EnableBatchProcessing annotation, the attempt to create the new batch job runs at the default isolation level of SERIALIZABLE, as opposed to using the components that I have configured with ISOLATION_READ_COMMITTED. The solution mentioned above regarding the use of multiple datasources makes some sense to me for exactly that reason.

I am also concerned by the statements above indicating that @Transactional doesn't play well with batch jobs. Since the batch and transaction management capabilities are part of the same framework family, I would expect the opposite to be true: that they would be optimized to work together.

As far as the insertion of dummy rows is concerned: in later versions of Oracle (I am currently working with 18c), the default allocation setting is to not allocate space until the first insert takes place. Unfortunately, it appears that when the batch is executed for the first time, the batch job ends up competing with the process that is trying to expand the table size, and this causes the failure. I forget the exact name of the setting, but your DBAs should be able to adjust those table settings appropriately on the Spring Batch management tables.

I would like to see more specific information on how to configure a separate datasource specifically for the Spring Batch related commits.

@grimch

grimch commented Mar 17, 2021

I was able to track down the issue to the use of the "PRIMARY KEY" qualifier in the CREATE TABLE statements, in combination with the Oracle transaction isolation level "SERIALIZABLE".
One can test this with a simple SQL script:
CREATE TABLE test1 (
test_id NUMBER(1) NOT NULL PRIMARY KEY
);

SET TRANSACTION ISOLATION LEVEL SERIALIZABLE;
INSERT INTO test1 VALUES ( 1 );
COMMIT;
Result: ORA-08177 Error.

Root cause:
The problem is not related to Spring Batch code in particular, but to the way Oracle has been creating tables since version 11.2 (see here for details: https://oracle-base.com/articles/11g/segment-creation-on-demand-11gr2).

A detailed description of the general issue can be found here: https://asktom.oracle.com/pls/apex/asktom.search?tag=isolation-level-serialization

Solution I used
Adding "SEGMENT CREATION IMMEDIATE" to the create clause above solved the problem:
CREATE TABLE test1 (
test_id NUMBER(1) NOT NULL PRIMARY KEY
)
SEGMENT CREATION IMMEDIATE;

Correspondingly I added the same to all "CREATE TABLE" statements for the Spring Batch job repository and things worked fine.
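Concretely, that change applied to one of the job repository tables would look roughly like this (a sketch based on the stock Oracle schema script shipped with Spring Batch; verify the column definitions against the script for your version):

```sql
-- BATCH_JOB_INSTANCE with segment creation forced at CREATE TABLE time
-- instead of being deferred to the first insert
CREATE TABLE BATCH_JOB_INSTANCE (
    JOB_INSTANCE_ID NUMBER(19,0) NOT NULL PRIMARY KEY,
    VERSION         NUMBER(19,0),
    JOB_NAME        VARCHAR2(100) NOT NULL,
    JOB_KEY         VARCHAR2(32)  NOT NULL,
    CONSTRAINT JOB_INST_UN UNIQUE (JOB_NAME, JOB_KEY)
)
SEGMENT CREATION IMMEDIATE;
```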

@fmbenhassine

@grimch

Thank you very much for this analysis! I'm no expert at Oracle, but if SEGMENT CREATION IMMEDIATE turns out to be the solution, I would re-open this issue and update the DDL scripts for Oracle accordingly.

@P-Hartford

VERY IMPORTANT NOTE: Oracle DOES NOT support ISOLATION_READ_UNCOMMITTED, yet this limitation is not mentioned anywhere in the Spring documentation.

This would be very important indeed. Do you have any reference to that from the official oracle documentation?

As far as the insertion of dummy rows is concerned, in later versions of Oracle, I am currently working with 18c, the default allocation setting is to not allocate space until the first insert takes place. Unfortunately, it appears that when the batch is executed for the first time, it seems to cause a situation where the batch job is competing with the process that is trying to expand the table size and this causes the failure.

This seems to be accurate based on the reference shared by @grimch, since the default segment creation is DEFERRED. So I guess that if segment creation is changed to IMMEDIATE as suggested by @grimch, there would be no competition between these two processes at the time of first job instance creation. I will add an end-to-end test with a dockerized oracle instance to confirm that (similar to this one). In the meantime, I would be grateful to anyone who has been through this issue to give this solution a try and share feedback here.

@grimch

grimch commented Mar 24, 2021

@benas
Official Oracle documentation depends on the Oracle version used, but from what I could see there are only two isolation levels that one can set interactively:
SET TRANSACTION ISOLATION LEVEL READ COMMITTED
SET TRANSACTION ISOLATION LEVEL SERIALIZABLE

(I don't know if it is allowed to deep-link Oracle documentation here, but searching for "SET TRANSACTION ISOLATION LEVEL" and going to one of the links of the Oracle Help Center (https://docs.oracle.com/en/) will point you to it.)

Note that these isolation levels are for manipulating data. There are "transaction levels" for reading data as well. The respective statements are:
SET TRANSACTION READ ONLY
SET TRANSACTION READ WRITE

So much for doing it interactively; when using Spring, we will usually interact with the database through the java.sql.Connection class, either directly or indirectly via some framework.

To get the background, I suggest having a look at this quite interesting article, which has a section about the ANSI isolation level "READ UNCOMMITTED": https://blogs.oracle.com/oraclemagazine/on-transaction-isolation-levels
It basically says that Oracle doesn't need "READ UNCOMMITTED" because the situation it addresses cannot arise given the way the database works.

The final proof that Oracle does not support "java.sql.Connection.TRANSACTION_READ_UNCOMMITTED" can be obtained with a simple Java test class:

import java.sql.*;

public class Test {
	public static void main(String args[]) throws Exception {
		// args: JDBC URL, database user, password
		DriverManager
			.getConnection(args[0], args[1], args[2])
			.setTransactionIsolation(Connection.TRANSACTION_READ_UNCOMMITTED);
	}
}

If you run this with

java -classpath <your_Oracle_jdbc_driver>;. Test <your_jdbc_url> <your_database_user> <your_passwd>

you will get the following exception:

Exception in thread "main" java.sql.SQLException: READ_COMMITTED and SERIALIZABLE are the only valid transaction levels
        at oracle.jdbc.driver.PhysicalConnection.setTransactionIsolation(PhysicalConnection.java:2862)
        at Test.main(Test.java:7)

QED.

@fmbenhassine

fmbenhassine commented Apr 15, 2021

@grimch Thank you for your feedback on the supported isolation levels in Oracle, much appreciated!

Based on previous comments, I'm re-opening this issue:

@p1xel-dev

Hi @benas, Hi All...

Just adding my 2 cents here. We have dropped the batch tables and recreated them with SEGMENT CREATION IMMEDIATE, and the issue still persists.

Our Setup so far:

  • Oracle Database 19c Enterprise Edition Release 19.0.0.0.0 - Production
  • Changed the INITRANS value to 3 on all batch job tables
  • MAXPOOL size of 5 in the JDBC Connection Params
  • Isolation Level: TRANSACTION_READ_COMMITTED
  • Mvn dependency: spring-batch-core:jar:4.3.1:compile

PreparedStatementCallback; SQL [INSERT into BATCH_JOB_INSTANCE(JOB_INSTANCE_ID, JOB_NAME, JOB_KEY, VERSION) values (?, ?, ?, ?)]; ORA-08177: can't serialize access for this transaction ; nested exception is java.sql.SQLException: ORA-08177: can't serialize access for this transaction

We are running out of ideas 😞
Any feedback is more than welcome

@grimch

grimch commented Dec 2, 2021

@p1xel-dev
To narrow down the issue you are facing, I would suggest doing the following:

a)
Try to insert a record directly into the table "batch_job_instance" using SQL Developer, SQL*Plus, or whatever database client you prefer:

SET TRANSACTION ISOLATION LEVEL SERIALIZABLE;

INSERT
INTO batch_job_instance (
job_instance_id,
job_name,
job_key,
version
) values (
batch_job_seq.NEXTVAL,
'TEST_JOB', 'TEST_JOB_KEY', 1
);
COMMIT;

This is to see if the ORA-08177 error shows up independently of accessing the table via Spring.

b)
In case the error does show up, create the test table and insert into it as I mentioned further above:

CREATE TABLE test1 (
test_id NUMBER(1) NOT NULL PRIMARY KEY
)
SEGMENT CREATION IMMEDIATE;

SET TRANSACTION ISOLATION LEVEL SERIALIZABLE;
INSERT INTO test1 VALUES ( 1 );
COMMIT;

  • If a) doesn't fail then it is probably an issue with Spring and we need to have another look there.

  • If a) fails but b) doesn't then it might be that your table creation for "batch_job_instance" has not happened as expected.

  • If b) fails as well then it is not an issue related to Spring but Oracle specific and needs to be addressed in the respective support forums.

@p1xel-dev

p1xel-dev commented Dec 2, 2021

Hi @grimch, Thanks for your answer.

So, I tested a) and it worked correctly using SQLDeveloper.

(Screenshot attached: SQL Developer output from 2021-12-02 showing the successful insert.)

@grimch

grimch commented Dec 3, 2021

Hi,

it looks to me like you executed a) on a table which already had records in it. If that was the case, then the segment would already have been there and no error was to be expected.

If so, could you please repeat a) with a fresh set of tables? If you do not want to delete the original ones, just modify the respective script and prefix the table names with "x_" for this exercise.

@p1xel-dev

Hi @grimch, Thanks for the information.

I tested it again against BATCH_JOB_INSTANCE_X and a new SEQ and it also worked fine. Data stored correctly no issues shown on SQL Developer.

@grimch

grimch commented Dec 3, 2021

Hi @p1xel-dev
I am on holiday for a week, but when I am back I will reach out to you directly to discuss next steps.
My idea is to replicate the issue in your environment with a minimalistic Spring Batch application where we keep as many of the default settings as possible.

@p1xel-dev

Absolutely. Thank you very much.

Just in case I haven't stated it: this issue happens very rarely, perhaps between 1 and 3 times a day, on a job that runs every 15 minutes.

@grimch

grimch commented Dec 3, 2021

One more thing for the moment - also of interest for the bigger audience.
The issue you are facing seems to be different from the problem I addressed in my initial post:
that one was about ORA-08177 coming up on newly created database schemas.
Yours is different, but I am eager to find the root cause nevertheless :-)

@grimch

grimch commented Dec 13, 2021

Here are some considerations about the situation described by @p1xel-dev, which is "ORA-08177: can't serialize access for this transaction" occurring during day-to-day Spring processing and NOT during initial set-up.

One aspect of this is that the error shows up even if the isolation level has been set to "TRANSACTION_READ_COMMITTED" in the configuration file.
This is the case if, for example, you are using a configuration class derived from "org.springframework.batch.core.configuration.annotation.DefaultBatchConfigurer".

The reason for this seems to be the following:

  • The method "createJobRepository()" in "DefaultBatchConfigurer" creates an instance of "org.springframework.batch.core.repository.support.JobRepositoryFactoryBean".
  • "JobRepositoryFactoryBean" is derived from "org.springframework.batch.core.repository.support.AbstractJobRepositoryFactoryBean".
  • The latter has "DEFAULT_ISOLATION_LEVEL" = "ISOLATION_SERIALIZABLE".
  • This value is not modified in "DefaultBatchConfigurer", and therefore the isolation level is not changed as far as job repository actions are concerned.

One way to overcome this (maybe there are more elegant ones) would be to override the "createJobRepository()" method in your configuration class and set the "isolationLevelForCreate" property explicitly:

@Override
protected JobRepository createJobRepository() throws Exception {
    JobRepositoryFactoryBean factory = new JobRepositoryFactoryBean();
    factory.setDataSource(this.dataSource);
    factory.setTransactionManager(this.getTransactionManager());
    factory.setIsolationLevelForCreate("ISOLATION_READ_COMMITTED");
    factory.afterPropertiesSet();
    return factory.getObject();
}

@p1xel-dev

@grimch Thank you very much for the insights, we have been monitoring the jobs for the last couple of days and I can confirm that overriding the createJobRepository() method and passing the isolation level has fixed the issue we were facing.
