Recently I was given a seed data file generated from SQLcl, containing a bunch of INSERT statements. It ran fine in SQL Developer but failed in SQLcl with the following:


Error starting at line : 3 File @ file:/Users/neiferna/bug/DEMO_DATA_TABLE.sql
In command -
Insert into DEMO (ID,DISPLAY_NAME) values (2,'multiline feedback

Error at Command Line : 3 Column : 46 File @ file:/Users/neiferna/bug/DEMO_DATA_TABLE.sql
Error report -
SQL Error: ORA-01756: quoted string not properly terminated
01756. 00000 -  "quoted string not properly terminated"
*Cause:    A quoted string must be terminated with a single quote
           mark (').
*Action:   Insert the closing quote and retry the statement.

More Details :
  1* Insert into DEMO (ID,DISPLAY_NAME) values (2,'multiline feedback
SP2-0734: unknown command beginning "2')..." - rest of line ignored.

SP2-0223: No lines in SQL Buffer.SQL>

It is common to track seed/sample data as scripts rather than dump files: scripts are easy for developers to maintain, work well in version control, and can be used directly in releases.

I went ahead and created a base case for the failure and realized that it was failing on any statement that had newlines inside the INSERT. Here is a sample file base case:

Insert into DEMO (ID,DISPLAY_NAME) values (2,'multiline feedback

1

2');
Shoutout to Martin D'Souza for the tip. All you need to do is the following:

set sqlblanklines on;

Problem solved.

Here is an example of the working insert with newlines:

SQL> set sqlblanklines on;
SQL> Insert into DEMO (ID,DISPLAY_NAME) values (2,'multiline feedback

1

2');

1 row inserted.

SQL> commit;

Commit complete.


Other format considerations

  • CSV
  • JSON and other formats not supported by LOAD


Using SQLcl unload/load with CSV yielded a similar error due to the newlines:

SQL> set loadformat csv
SQL> unload demo

format csv

column_names on
delimiter ,
enclosures ""
encoding UTF8
row_terminator default

** UNLOAD Start ** at 2024.01.16-17.09.51
Export Separate Files to /Users/neiferna/bug
File Name: /Users/neiferna/bug/DEMO_DATA_TABLE.csv
Number of Rows Exported: 1
** UNLOAD End ** at 2024.01.16-17.09.52
SQL> truncate table demo;

Table DEMO truncated.

SQL> load table demo DEMO_DATA_TABLE.csv


column_names on
delimiter ,
enclosures ""
double off
encoding UTF8
row_limit off
row_terminator default
skip_rows 0

#ERROR Insert failed for row  1
#ERROR --Line contains invalid enclosed character data or delimiter at position 21
#ERROR Row 1 data follows:
2,"multiline feedback
#ERROR Insert failed for row  5
#ERROR --Line contains invalid enclosed character data or delimiter at position 2
#ERROR Row 5 data follows:
#INFO Number of rows processed: 5
#INFO Number of rows in error: 2
#INFO No rows committed
WARNING: Processed with errors

I played around with different delimiter and loadformat settings but still had no luck. I will update this post once I find a solution for this one.

JSON/Other formats

You can UNLOAD to JSON and some other formats, but there is no native way to LOAD them back in yet; only CSV/DELIMITED is supported. The INSERT format can simply be run as-is, since it produces a .sql file.
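Combined with the sqlblanklines fix above, the INSERT-format round trip is straightforward. Here is a sketch of what that session can look like (the file name matches the earlier unload; your generated name may differ):

SQL> set loadformat insert
SQL> unload demo
SQL> truncate table demo;
SQL> set sqlblanklines on;
SQL> @DEMO_DATA_TABLE.sql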

SQL> help set loadformat
SET LOADFORMAT [ default|csv|delimited|html|insert|json|json-formatted|loader|t2|xml] [options...]

  default        : Load format properties return to default values
  csv            : comma separated values
  delimited      : (CSV synonym) delimited format, comma separated values by default
  html           : UNLOAD only, Hypertext Markup Language
  insert         : UNLOAD only, SQL insert statements
  json           : UNLOAD only, Java Script Object Notation
  json-formatted : UNLOAD only, "pretty" formatted JSON
  loader         : UNLOAD only, Oracle SQLLoader format
  t2             : UNLOAD only, T2 Metrics
  xml            : UNLOAD only, Extensible Markup Language

Quite often developers track seed/sample data, and some may prefer JSON. Maybe it will eventually become a native format that LOAD can read back in, but for now here is one of the many methods to load that file. In the past I have used a bash script that took the local JSON file and built a .sql file of string fragments, which eventually assembled a CLOB that could then be worked with in SQL. The approach below is smaller and a little simpler to get going.

Here is an example of what gets produced from set loadformat json

    "results": [
            "columns": [
                    "name": "ID",
                    "type": "NUMBER"
                    "name": "DISPLAY_NAME",
                    "type": "VARCHAR2"
            "items": [
                    "id": 2,
                    "display_name": "multiline feedback\n\n1\n\n2"
Example JSON format from UNLOAD (with 1 row)

Inserting using JSON

There are various ways to get a file into the database, e.g., PUT it to a RESTful ORDS service and store it in a table. The following example uses an OCI Object Storage bucket as temporary data storage: load the file from your desktop, then read it in the database.

1. Load the export into a Bucket in Object Storage (examples can be found here)

SQL> set loadformat json
SQL> oci profile PERSONAL
Region set to: us-ashburn-1
OCI Profile set to PERSONAL
Transfer method set to oci
SQL> cs<REMOVED>/b/demo-image-upload
DBMS_CLOUD Credential: Not Set
Transfer Method: oci
URI as specified:<REMOVED>/b/demo-image-upload
SQL> unload demo CS

DBMS_CLOUD Credential: Not Set
Transfer Method: oci
URI as specified:<REMOVED>/b/demo-image-upload
Final URI:<REMOVED>/b/demo-image-upload

format json

** UNLOAD Start ** at 2024.01.22-23.17.10
Export Separate Files to<REMOVED>/b/demo-image-upload
File Name: DEMO_DATA_TABLE_2.json
Number of Rows Exported: 1
** UNLOAD End ** at 2024.01.22-23.17.11

2. Pull the file with apex_web_service.make_rest_request and insert with the help of JSON_TABLE

declare
    l_clob clob;
begin
    l_clob := apex_web_service.make_rest_request(
        p_url         => '<REMOVED>/b/demo-image-upload/DEMO_DATA_TABLE_2.json',
        p_http_method => 'GET');

    insert into demo (id, display_name)
    select id, display_name
    from json_table(l_clob, '$.results[*].items[*]'
        columns (
            id           number         path '$.id',
            display_name varchar2(4000) path '$.display_name'
        ));

    commit;
end;
/
JSON insert script

You just need to modify the columns to fit your needs, and I am sure this insert/select could be generated dynamically from results.columns in the JSON.
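As a rough sketch of that dynamic idea (untested; it assumes l_clob is already populated as in step 2 and that the JSON column names match the table's columns), the column list can be pulled out of results[0].columns first:

declare
    l_clob clob;            -- populated via apex_web_service.make_rest_request as in step 2
    l_cols varchar2(4000);
begin
    -- build a comma-separated column list from the JSON metadata
    select listagg(lower(col_name), ', ') within group (order by null)
      into l_cols
      from json_table(l_clob, '$.results[0].columns[*]'
           columns (col_name varchar2(128) path '$.name'));

    -- l_cols is now something like 'id, display_name' and can be
    -- spliced into a dynamically built insert/select run with
    -- execute immediate; the JSON_TABLE column definitions would
    -- still need their types mapped from results[0].columns[*].type.
end;
/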