Use SQL*Loader to load data into the database
2011-02-23 16:09
SQL*Loader is the Oracle utility to use for high-performance data loads.
The SQL*Loader control file describes how the data will be loaded: it names the target table and specifies column datatypes, field delimiters, and so on.
See example:
$ more account_data.csv
8,"DZcLHm6H2iDNnvjU7TafFA==","53746a61-aeaa-4856-ba52-932fbb6b8161","nZAylFNJOOY=","DZcLHm6H2iDNnvjU7TafFA==",0,"2010-08-12",90,1
9,"5nBoqMWLF+w=","53746a61-aeaa-4856-ba52-932fbb6b8190","vpluH45reHw=","QS9PWc3XtBY83LkgSkGJJA==",0,"2010-08-12",90,1
$ more account_data.ctl
OPTIONS(BINDSIZE=8388608,READSIZE=8388608,ERRORS=-1,ROWS=1)
LOAD DATA
INFILE '/henry/account_data.csv'
APPEND INTO TABLE account
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(
id ,
field3 ,
field4 ,
field2 ,
field1 ,
field5 ,
field6 date 'yyyy-mm-dd',
field7 ,
field8
)
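Before running the load, it can be worth checking that the data file actually matches the column list in the control file. The sketch below recreates the two sample rows shown above and verifies that every row splits into 9 comma-separated fields, one per column in the control file (a plain comma split is safe here only because the quoted values in this data contain no embedded commas):

```shell
# Recreate the two sample rows from account_data.csv above.
cat > account_data.csv <<'EOF'
8,"DZcLHm6H2iDNnvjU7TafFA==","53746a61-aeaa-4856-ba52-932fbb6b8161","nZAylFNJOOY=","DZcLHm6H2iDNnvjU7TafFA==",0,"2010-08-12",90,1
9,"5nBoqMWLF+w=","53746a61-aeaa-4856-ba52-932fbb6b8190","vpluH45reHw=","QS9PWc3XtBY83LkgSkGJJA==",0,"2010-08-12",90,1
EOF

# Every row should have exactly 9 fields, matching the 9 columns
# listed between the parentheses of the control file.
awk -F',' 'NF != 9 { print "bad row " NR ": " NF " fields"; bad = 1 }
           END { exit bad }' account_data.csv && echo "field counts OK"
```

Rows that fail this check would otherwise end up in the bad file after the load, so catching them first saves a round trip.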
Then run the command from bash:
sqlldr userid=pv/pv control=account_data.ctl log=account_data.log
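Besides writing the log file, sqlldr also reports its result through the shell exit status. On UNIX the documented values are 0 (all rows loaded), 1 (command-line or syntax failure), 2 (warnings, e.g. some rows rejected), and 3 (fatal error). A small helper to act on it in a script (a sketch; the helper name is my own, not part of Oracle's tooling):

```shell
# Map the sqlldr exit status to a human-readable message.
describe_sqlldr_status() {
  case "$1" in
    0) echo "success: all rows loaded" ;;
    1) echo "failed: command-line or syntax error" ;;
    2) echo "warning: some rows rejected, check the .bad file" ;;
    3) echo "fatal: load aborted" ;;
    *) echo "unknown status: $1" ;;
  esac
}

# Example usage (only runs where sqlldr is installed):
if command -v sqlldr >/dev/null 2>&1; then
  sqlldr userid=pv/pv control=account_data.ctl log=account_data.log
  describe_sqlldr_status $?
fi
```

In batch jobs, treating status 2 as "investigate the bad file" rather than a hard failure is a common choice, since the remaining rows did load.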
View account_data.log:
SQL*Loader: Release 11.2.0.2.0 - Production on Tue Feb 22 22:15:41 2011
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
Control File: account_data.ctl
Data File: /henry/account_data.csv
Bad File: account_data.bad
Discard File: none specified
(Allow all discards)
Number to load: ALL
Number to skip: 0
Errors allowed: ALL
Bind array: 1 rows, maximum of 8388608 bytes
Continuation: none specified
Path used: Conventional
Table ACCOUNT, loaded from every logical record.
Insert option in effect for this table: APPEND
Column Name Position Len Term Encl Datatype
------------------------------ ---------- ----- ---- ---- ---------------------
ID FIRST * , O(") CHARACTER
FIELD3 NEXT * , O(") CHARACTER
FIELD4 NEXT * , O(") CHARACTER
FIELD2 NEXT * , O(") CHARACTER
FIELD1 NEXT * , O(") CHARACTER
FIELD5 NEXT * , O(") CHARACTER
FIELD6 NEXT * , O(") DATE yyyy-mm-dd
FIELD7 NEXT * , O(") CHARACTER
FIELD8 NEXT * , O(") CHARACTER
Table ACCOUNT:
4 Rows successfully loaded.
0 Rows not loaded due to data errors.
0 Rows not loaded because all WHEN clauses were failed.
0 Rows not loaded because all fields were null.
Space allocated for bind array: 2313 bytes(1 rows)
Read buffer bytes: 8388608
Total logical records skipped: 0
Total logical records read: 4
Total logical records rejected: 0
Total logical records discarded: 0
Run began on Tue Feb 22 22:15:41 2011
Run ended on Tue Feb 22 22:15:41 2011
Elapsed time was: 00:00:00.32
CPU time was: 00:00:00.05
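The counts near the end of the log are what you check after every run: reads, rejections, and discards. A quick way to pull just those summary lines out programmatically (a sketch; the function name is my own, and the log name matches the command above):

```shell
# Extract the load-summary lines from a SQL*Loader log. Non-zero
# rejected/discarded counts mean some rows need attention.
summarize_load() {
  grep -E 'successfully loaded|logical records (read|rejected|discarded)' "$1"
}

# Only attempt it if the log from the run above exists.
[ -f account_data.log ] && summarize_load account_data.log || true
```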