Updating CLOB data
If at least one row in the table has BLOB data longer than 32K (which means that Derby will stream it, so far as I can tell), then the error will be: ERROR XCL30: An IOException was thrown when reading a 'BLOB' from an Input Stream. Surprisingly, it doesn't appear to matter what the trigger statement is actually doing: as long as it references the BLOB column at least once, one of these errors will occur, depending on the length of the data. And if the data is greater than 32K, the error happens regardless of what the trigger does or whether it references the BLOB column at all.
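The behavior described above hinges on a 32K cutoff: smaller values are materialized, larger ones are streamed. A minimal sketch of that cutoff, assuming the 32K figure from the text (the class and method names here are hypothetical, not a Derby API):

```java
public class BlobStreamCheck {
    // Derby appears to stream BLOB values once they exceed 32K bytes
    // (assumption based on the observed behavior, not documented API).
    static final long STREAM_THRESHOLD = 32 * 1024;

    static boolean willBeStreamed(long blobLengthBytes) {
        return blobLengthBytes > STREAM_THRESHOLD;
    }

    public static void main(String[] args) {
        System.out.println(willBeStreamed(32 * 1024));     // exactly at the limit
        System.out.println(willBeStreamed(32 * 1024 + 1)); // just over the limit
    }
}
```

A check like this makes it easy to reason about which rows in a table would trip the streaming path and, per the report above, trigger the IOException.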
We have been using the Oracle Client libraries so far (to write smaller amounts of data), and that has worked out quite well for us.
In older Java projects I've been writing and reading CLOBs, and I know you have to handle those fields a bit differently than normal VARCHARs and such. I looked through the .NET Dynamic Help and whatever I could find on the Internet, and the funny thing is I found three different ways of doing it: two really simple and one not so simple.
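The "handle differently" part for CLOBs in JDBC usually means draining a `Reader` (from `ResultSet.getCharacterStream`) instead of calling `getString` as you would for a VARCHAR. A sketch of that drain step, using a `StringReader` as a stand-in for the Reader a real CLOB column would return:

```java
import java.io.IOException;
import java.io.Reader;
import java.io.StringReader;

public class ClobRead {
    // Drain a Reader (such as the one ResultSet.getCharacterStream returns
    // for a CLOB column) into a String; VARCHAR columns skip this step.
    static String readAll(Reader reader) throws IOException {
        StringBuilder sb = new StringBuilder();
        char[] buf = new char[8192];
        int n;
        while ((n = reader.read(buf)) != -1) {
            sb.append(buf, 0, n);
        }
        return sb.toString();
    }

    public static void main(String[] args) throws IOException {
        // StringReader stands in for a real CLOB stream here.
        System.out.println(readAll(new StringReader("clob contents")));
    }
}
```

In real code the Reader comes from the result set inside the row loop, and should be closed before moving to the next row.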
That said, there are really four to-dos with this issue:
1 - Document the current restrictions.
2 - Produce a better error message under the current restriction.
3 - Implement LOB support in the referenced tables.
4 - Fix triggers so that they work in cases where the triggered-SQL-statement does not reference a BLOB column (currently, a trigger will fail with an IOException if the target table has BLOB data larger than 32K, even if that column isn't actually referenced by the trigger action).
The maximum length of TINYTEXT is 255 characters (an 8-bit length), TEXT is 65,535 characters (16 bits), MEDIUMTEXT is 16,777,215 characters (24 bits), and LONGTEXT is 4,294,967,295 characters (32 bits).
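These maxima all follow the same pattern: each TEXT variant stores its length in 8, 16, 24, or 32 bits, so the ceiling is 2^bits - 1. A quick check of that arithmetic:

```java
public class TextTypeLimits {
    // Each MySQL TEXT variant uses an n-bit length prefix,
    // so the maximum length is 2^n - 1.
    static long maxLength(int lengthBits) {
        return (1L << lengthBits) - 1;
    }

    public static void main(String[] args) {
        System.out.println("TINYTEXT:   " + maxLength(8));   // 255
        System.out.println("TEXT:       " + maxLength(16));  // 65535
        System.out.println("MEDIUMTEXT: " + maxLength(24));  // 16777215
        System.out.println("LONGTEXT:   " + maxLength(32));  // 4294967295
    }
}
```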