Parameter types improvement #2122
Comments
Dave Syer commented I see the value in being able to inject a strongly typed object into a step component (like a reader). But I don't think it justifies a new XML namespace feature because you can quite easily accomplish the basic injection use case with Spring 3, e.g.
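The snippet behind "e.g." did not survive the import from JIRA. A minimal sketch of the kind of Spring 3 wiring Dave likely means, using SpEL late binding in a step-scoped bean; the bean names (`fooDao`, `FooReader`), the `findFoo` method, and the `foo.id` parameter are all hypothetical:

```xml
<!-- Hypothetical sketch: a step-scoped component resolves the strongly
     typed object itself via SpEL late binding, so no new namespace
     feature is needed. Bean, class, and method names are illustrative. -->
<bean id="reader" class="com.example.FooReader" scope="step">
    <property name="foo"
              value="#{@fooDao.findFoo(jobParameters['foo.id'])}"/>
</bean>
```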
Or am I missing something?
Stéphane Nicoll commented If you feel at ease with the XML config, yes. If you don't, or you want to abstract the mapping away, it's a different story. Say that a company has a dozen strong types and the associated ParameterMapper implementations. One developer creates the list and an abstract Job bean with a ref to it, and you're done.
Dave Syer commented What kind of parameter mappers would be generic enough to be valuable then (not the fooDao-based one in my example, I guess)?
Stéphane Nicoll commented None, and I guess that's the whole point. If you're happy with String, an ID as a long, a simple date and a numeric value, you don't need this. If you need a more complex input object that you can identify by an ID, then the mapper implementation is just too specific to be generic. The only mappers I'd have in mind are something in line with JpaItemWriter and HibernateItemWriter: a JpaParameterMapper would take the ID of a persistent object and its target class and would return the persistent object instead of the ID, and a HibernateParameterMapper would do the same with a session factory. This is our primary use case (using persistent objects as parameters instead of a raw ID). Other use cases are possible, like injecting a java.io.File based on a file path.
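A hedged Java sketch of the JpaParameterMapper idea described here. The `ParameterMapper` shape is an assumption taken from this thread, not real Spring Batch API, and the in-memory map below stands in for a JPA `EntityManager` so the example runs without a database:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical contract, assumed from this proposal (not a real Spring Batch type).
interface ParameterMapper {
    boolean supports(String parameterName, Object rawValue);
    Object map(String parameterName, Object rawValue);
}

// Illustrative domain object identified by an ID.
class Customer {
    final long id;
    final String name;
    Customer(long id, String name) { this.id = id; this.name = name; }
}

// Sketch of the JpaParameterMapper idea: the raw job parameter is the ID,
// the job receives the persistent object. The map below is a stand-in for
// an EntityManager; a real mapper would call entityManager.find(type, id).
class InMemoryCustomerMapper implements ParameterMapper {
    private final Map<Long, Customer> repository = new HashMap<>();

    InMemoryCustomerMapper() {
        repository.put(42L, new Customer(42L, "ACME"));
    }

    @Override
    public boolean supports(String parameterName, Object rawValue) {
        return parameterName.endsWith(".id") && rawValue instanceof Long;
    }

    @Override
    public Object map(String parameterName, Object rawValue) {
        return repository.get((Long) rawValue);
    }
}
```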
Stéphane Nicoll commented Injecting Spring's Resource object would also be a good example.
Dave Syer commented Spring uses its own type conversion mechanisms for this (in particular, Resource has its own PropertyEditor). I would guess that if you use a custom ConversionService in Spring 3.0 you can avoid the extra JobParameters mapper entirely. Please try it and let us know if that causes problems.
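A stripped-down illustration of the ConversionService approach Dave suggests. Spring 3's real API lives in `org.springframework.core.convert` (`ConversionService`, `Converter<S, T>`); the `MiniConversionService` below is a hypothetical stand-in so the sketch runs without Spring on the classpath:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical, simplified analogue of Spring's Converter<String, T>.
interface StringConverter<T> {
    T convert(String source);
}

// Hypothetical, simplified analogue of Spring's ConversionService:
// register converters per target type, then convert raw String job
// parameters into strongly typed objects on demand.
class MiniConversionService {
    private final Map<Class<?>, StringConverter<?>> converters = new HashMap<>();

    <T> void addConverter(Class<T> targetType, StringConverter<T> converter) {
        converters.put(targetType, converter);
    }

    @SuppressWarnings("unchecked")
    <T> T convert(String source, Class<T> targetType) {
        StringConverter<?> converter = converters.get(targetType);
        if (converter == null) {
            throw new IllegalArgumentException("No converter for " + targetType);
        }
        return (T) converter.convert(source);
    }
}
```

With a converter registered for `java.io.File`, a raw path parameter can be consumed as a `File` without any extra JobParameters mapper, which is the gist of Dave's suggestion.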
Closing this issue, based on the last comment.
This commit also changes the way job parameters are parsed and persisted. NB: this commit should ideally have been split into two change sets, but the changes are so tightly related that it was not possible to separate them. Related to: * spring-projects#3960 * spring-projects#2122 * spring-projects#1035 * spring-projects#1983
Re-opened as part of #3960.
Resolved with #4204.
Stéphane Nicoll opened BATCH-1461 and commented
Spring Batch does not offer a way to define a custom parameter type. The reason is that job parameters are persisted in the database and therefore require a pre-defined set of types; right now String, long, double and Date are supported. A first improvement would allow any batch job developer to plug in ParameterMapper implementation(s). Here is a proposal for the contract of the ParameterMapper interface:
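The original interface snippet from BATCH-1461 was lost in the import. What follows is a plausible reconstruction based only on the surrounding description (consulted per parameter, returning the richer object), and should be read as an assumption rather than the author's actual proposal:

```java
// Hypothetical reconstruction of the proposed contract; the original
// snippet did not survive the JIRA import. A mapper turns the raw,
// persistable parameter value (String, long, double, Date) into the
// richer object the job actually works with.
interface ParameterMapper {

    /** Whether this mapper knows how to handle the given parameter. */
    boolean supports(String parameterName, Object rawValue);

    /** Map the raw value to the object the job should receive. */
    Object map(String parameterName, Object rawValue);
}
```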
The parameter mapper is only used to manipulate a more complex object during the batch job execution instead of a primitive/raw value. It's not meant to transform the complex value to the primitive type.
The mappers could be defined on the job like this:
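The configuration snippet was also lost in the import. A hypothetical sketch of what the proposed XML could have looked like; the element and class names are invented for illustration, and this namespace feature was never implemented:

```xml
<!-- Hypothetical: element and class names are invented for illustration. -->
<job id="businessJob">
    <parameter-mappers>
        <bean class="com.example.JpaParameterMapper"/>
        <bean class="com.example.FileParameterMapper"/>
        <bean class="com.example.DefaultParameterMapper"/> <!-- end of chain -->
    </parameter-mappers>
    <!-- steps elided -->
</job>
```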
A default mapper implementation can be added at the end of the chain to return the actual primitive type if no custom mapper handled the parameter.
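The chain-with-default idea can be sketched in plain Java. The `ParameterMapper` shape and all class names below are assumptions derived from this proposal, not real Spring Batch API; custom mappers are consulted in order, and a terminal default mapper returns the raw value untouched:

```java
import java.io.File;
import java.util.Arrays;
import java.util.List;

// Hypothetical contract, assumed from this proposal (not a real Spring Batch type).
interface ParameterMapper {
    boolean supports(String parameterName, Object rawValue);
    Object map(String parameterName, Object rawValue);
}

// Example custom mapper: turns a raw String path into a java.io.File,
// one of the use cases mentioned in the thread above.
class FileParameterMapper implements ParameterMapper {
    public boolean supports(String parameterName, Object rawValue) {
        return parameterName.endsWith(".file") && rawValue instanceof String;
    }
    public Object map(String parameterName, Object rawValue) {
        return new File((String) rawValue);
    }
}

// Default mapper at the end of the chain: returns the primitive value as-is.
class DefaultParameterMapper implements ParameterMapper {
    public boolean supports(String parameterName, Object rawValue) { return true; }
    public Object map(String parameterName, Object rawValue) { return rawValue; }
}

// Consults mappers in registration order; the first that supports the
// parameter wins, so a DefaultParameterMapper should terminate the chain.
class ParameterMapperChain {
    private final List<ParameterMapper> mappers;

    ParameterMapperChain(ParameterMapper... mappers) {
        this.mappers = Arrays.asList(mappers);
    }

    Object resolve(String parameterName, Object rawValue) {
        for (ParameterMapper mapper : mappers) {
            if (mapper.supports(parameterName, rawValue)) {
                return mapper.map(parameterName, rawValue);
            }
        }
        return rawValue; // unreachable when a default mapper ends the chain
    }
}
```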
One concrete example of this is a business object identified by an ID. The job parameter would initially be the ID, but the job would use the actual business object while executing the batch. The ParameterType interface may need to be renamed to something more explicit, since the JobParameter could hold other data types in this case.
The advantage of this approach is that it does not break command-line execution, for instance, since the raw ID is still used as input. It is also persistence-friendly.
Affects: 2.1.0.M3