
Lucas Scabora

Members
  • Content count: 3
  • Joined
  • Last visited

Community Reputation: 0 Neutral

About Lucas Scabora

  • Rank: Newbie
  1. Lucas Scabora

    Bulk dynamic insert operations into an index-organized table

    Hello, thanks for all the help. I think it will be better to export the tables as CSV files using this (your) tutorial: http://www.dba-oracle.com/t_export_table_to_csv.htm Once the data are in files, I usually sort each file's lines directly in bash, splitting according to the available memory, following this tutorial: https://unix.stackexchange.com/questions/350046/sort-large-csv-files-90gb-disk-quota-exceeded Thanks again for everything. If I come up with a solution inside Oracle, I will let you know. Best regards.
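
    P.S. The export step from that tutorial is roughly the usual SQL*Plus spool pattern, something like the sketch below (source_table and the column list are placeholders for my actual table):

        -- SQL*Plus sketch of a spool-based CSV export; names are placeholders.
        SET HEADING OFF FEEDBACK OFF PAGESIZE 0 LINESIZE 32767 TRIMSPOOL ON
        SPOOL /tmp/source_table.csv
        SELECT id || ',' || col FROM source_table;
        SPOOL OFF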
  2. Lucas Scabora

    Bulk dynamic insert operations into an index-organized table

    Hello, thanks for the advice. I'm loading the IOT from another table that already exists in Oracle. The PL/SQL procedure tries to transform the following table:

        id , column
        ---, ------
        1  , A
        1  , B
        1  , C
        2  , X

    into the following table (as an IOT):

        id , column1 , column2
        ---, --------, --------
        1  , A       , B
        1  , C       , NULL
        2  , X       , NULL

    The procedure opens a cursor that sorts the first table by its id and iterates over the rows, filling one row of the resulting table at a time. It does this by dynamically building the string for the VALUES clause of an INSERT statement. When the id changes or the number of columns exceeds N, it inserts the current row (EXECUTE IMMEDIATE) and starts a new one. I also append a sequential number to the id to compose the primary key, because of its unique constraint.

    In other words, according to the value of N (in the example, N = 2), I perform a PIVOT-like operation limited to N columns; if the number of rows with the same id is greater than N, additional rows are allocated to hold the overflow. The only index/constraint is the primary key, which I believe is required for the IOT approach.

    Following your advice, I will test the PARALLEL and APPEND hints and the NOLOGGING feature, and I will look for the database's bottlenecks with an AWR report. From what I have measured so far, the processing is mostly I/O bound, and unfortunately I cannot change the machine's hardware.

    A (partial) solution that I adopted is generating a CSV file with the resulting table and loading it with SQL*Loader. It works, but generating more than 20 GB of data and copying it back into the database seems redundant.
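
    For concreteness, the set-based rewrite I plan to test instead of the row-by-row cursor would be something along these lines (a sketch only: src and iot_target are placeholder names, and N = 2 as in the example above):

        -- Sketch: one set-based INSERT replacing the cursor loop (N = 2).
        -- src(id, col) and iot_target(id, seq, column1, column2) are placeholders.
        ALTER SESSION ENABLE PARALLEL DML;  -- needed for the insert itself to run in parallel

        INSERT /*+ APPEND PARALLEL(iot_target) */ INTO iot_target (id, seq, column1, column2)
        SELECT id,
               grp AS seq,  -- per-id sequence number, keeps the primary key unique
               MAX(CASE WHEN pos = 1 THEN col END) AS column1,
               MAX(CASE WHEN pos = 2 THEN col END) AS column2
        FROM  (SELECT id,
                      col,
                      MOD(ROW_NUMBER() OVER (PARTITION BY id ORDER BY col) - 1, 2) + 1 AS pos,
                      CEIL(ROW_NUMBER() OVER (PARTITION BY id ORDER BY col) / 2)       AS grp
               FROM src)
        GROUP BY id, grp;

        COMMIT;
        -- Whether APPEND (direct path) takes effect on an IOT depends on the Oracle
        -- version; if it is ignored, the statement still runs as a conventional insert.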
  3. Hello, I implemented a PL/SQL procedure to create and populate an index-organized table. The procedure dynamically creates the N columns of the target table (N is given as a parameter). The rows come from an existing, huge table (1,468,365,182 rows), which I read through a cursor, one row at a time. The objective is similar to a PIVOT operation, combining multiple rows into additional columns; the difference is that I combine at most N columns per row, and any extra values are allocated to additional rows as required.

    The procedure builds each row following the criterion above and then inserts it into the index-organized table, but the time required is impractical (up to several days). I also tried to adapt the procedure to use FORALL and BULK COLLECT INTO; however, since each tuple is built dynamically, that approach failed. Another attempt was to suppress commits during the insert operations, but the time remained impractical. Is there any other way to perform a huge number of insert operations into an index-organized table efficiently?
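
    For reference, the FORALL adaptation failed because the VALUES string changed from row to row, so there was no fixed statement for FORALL to bulk-bind. If the statement text were constant and only the values were bound, the pattern would look roughly like this (a sketch with placeholder names; iot_target and the collections are illustrative, not my actual code):

        DECLARE
          TYPE t_num IS TABLE OF NUMBER;
          TYPE t_val IS TABLE OF VARCHAR2(100);
          -- Illustrative pre-pivoted rows; in practice these would be filled in batches.
          l_ids  t_num := t_num(1, 1, 2);
          l_seqs t_num := t_num(1, 2, 1);          -- per-id sequence for the primary key
          l_c1   t_val := t_val('A', 'C', 'X');
          l_c2   t_val := t_val('B', NULL, NULL);
        BEGIN
          -- The statement text is fixed; only bind values vary per row, so FORALL
          -- can bulk-bind even though the SQL is dynamic.
          FORALL i IN 1 .. l_ids.COUNT
            EXECUTE IMMEDIATE
              'INSERT INTO iot_target (id, seq, column1, column2)
               VALUES (:1, :2, :3, :4)'
              USING l_ids(i), l_seqs(i), l_c1(i), l_c2(i);
          COMMIT;
        END;
        /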