1. JDBC Batching (see the quick sketch below)
2. PLSQL bulk binds
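For comparison, here is a minimal sketch of option 1, plain JDBC statement batching, written against the same messages table used further down. The Message holder class and the storeAll method name are just placeholders for whatever your cache store actually uses.

// Sketch of option 1: plain JDBC statement batching from a cache store's
// storeAll(). The Message holder class is a stand-in for the real cache value.
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.util.Map;

public class JdbcBatchWriter
{
    /** Minimal value holder, standing in for the real cache value class. */
    public static class Message
    {
        public final String type;
        public final String text;
        public Message(String type, String text) { this.type = type; this.text = text; }
    }

    /** Inserts a batch of cache entries and commits once per batch (auto-commit off). */
    public void storeAll(Connection con, Map<Long, Message> entries) throws Exception
    {
        String sql = "insert into messages (message_id, message_type, message) values (?, ?, ?)";
        try (PreparedStatement ps = con.prepareStatement(sql))
        {
            for (Map.Entry<Long, Message> e : entries.entrySet())
            {
                ps.setLong(1, e.getKey());
                ps.setString(2, e.getValue().type);
                ps.setString(3, e.getValue().text);
                ps.addBatch();     // queue the row on the client, no round trip yet
            }
            ps.executeBatch();     // send the queued rows together
            con.commit();          // commit once per batch, not per row
        }
    }
}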
What's important, though, is that we only commit after every X records, and only go to the database when we are ready to insert those X records in a single round trip. To me, PLSQL bulk binds make sense here, as all it takes is converting those cache entries into SQL collection types the database can work with from PLSQL. Once that's done, the PLSQL itself really is as simple as this:
-- mid, mt and m are PLSQL collections holding the batched message ids, types and bodies
FORALL i IN mid.FIRST..mid.LAST
  insert into messages (message_id, message_type, message)
  values (mid(i), mt(i), m(i));
COMMIT;
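On the Java side, a hedged sketch of what the JDBC caller might look like: it assumes the FORALL above is wrapped in a stored procedure bulk_insert_messages whose parameters are schema-level collection types (T_NUMBER_TAB, T_VARCHAR_TAB). The procedure name, type names and the Map layout are illustrative only, not taken from the support note.

// Sketch of the JDBC caller for the FORALL block above, assuming it is wrapped
// in a stored procedure bulk_insert_messages(p_ids, p_types, p_msgs) whose
// parameters are schema-level collection types T_NUMBER_TAB / T_VARCHAR_TAB.
// All of those names are illustrative.
import java.math.BigDecimal;
import java.sql.CallableStatement;
import java.sql.Connection;
import java.util.Map;

import oracle.jdbc.OracleConnection;

public class BulkBindWriter
{
    /** Map value layout assumed here: { message_type, message }. */
    public void storeAll(Connection con, Map<Long, String[]> entries) throws Exception
    {
        int n = entries.size();
        BigDecimal[] ids   = new BigDecimal[n];
        String[]     types = new String[n];
        String[]     msgs  = new String[n];

        int i = 0;
        for (Map.Entry<Long, String[]> e : entries.entrySet())
        {
            ids[i]   = BigDecimal.valueOf(e.getKey());
            types[i] = e.getValue()[0];
            msgs[i]  = e.getValue()[1];
            i++;
        }

        OracleConnection ocon = con.unwrap(OracleConnection.class);
        try (CallableStatement cs = ocon.prepareCall("{ call bulk_insert_messages(?, ?, ?) }"))
        {
            // createOracleArray turns a Java array into a SQL collection for binding
            cs.setArray(1, ocon.createOracleArray("T_NUMBER_TAB",  ids));
            cs.setArray(2, ocon.createOracleArray("T_VARCHAR_TAB", types));
            cs.setArray(3, ocon.createOracleArray("T_VARCHAR_TAB", msgs));
            cs.execute();   // one call: the FORALL inserts every row and commits
        }
    }
}

The point is the same as the batching example: the whole batch travels to the database in a single call and gets committed once.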
I will post a full example of this shortly, which also shows how to read the data back into the cache.
More about the caching pattern I want to set up can be found here. My initial testing showed 1,000,000 records inserted without too many issues, BUT there are a couple of things to be aware of on both the Coherence and Oracle sides.
http://coherence.oracle.com/display/COH35UG/Read-Through%2C+Write-Through%2C+Write-Behind+and+Refresh-Ahead+Caching
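Until that example is posted, here is a minimal sketch of the read-back (load) side, again assuming the same messages table; wiring it into a Coherence CacheLoader/CacheStore implementation is left out.

// Sketch of the read-back side: load one message by key with plain JDBC.
// Hooking this into a Coherence CacheLoader/CacheStore is omitted.
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class MessageReader
{
    /** Returns the message text for a key, or null if the row does not exist. */
    public String load(Connection con, long messageId) throws Exception
    {
        String sql = "select message from messages where message_id = ?";
        try (PreparedStatement ps = con.prepareStatement(sql))
        {
            ps.setLong(1, messageId);
            try (ResultSet rs = ps.executeQuery())
            {
                return rs.next() ? rs.getString(1) : null;
            }
        }
    }
}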
4 comments:
Could you please post the example which you mentioned? Thanks
Took me a while to get this going, but if you have access to My Oracle Support you can download and run the demo as follows:
Write-Behind Caching Using An Oracle Database and PLSQL Bulk Binds (Doc ID 970104.1)
PLSQL bulk binds through JDBC involve a little more code but are worth the effort if you're inserting large or frequent batches of records through a cache store.
It's all documented in the note above.
Hi, is the above document still available? Any example code would be much appreciated.
You have to log into the Oracle Support portal for this code.