
Pentaho: Cannot write data from Pentaho to BigQuery

I am connecting Pentaho to BigQuery using Starschema's JDBC driver. Reading data from BigQuery into Pentaho works fine. However, I cannot write data from Pentaho to BigQuery: an exception is thrown while inserting rows, and the operation appears to be unsupported. How can I fix this?

Error message:

2017/10/30 14:27:43 - Table output 2.0 - ERROR (version 7.1.0.0-12, build 1 from 2017-05-16 17.18.02 by buildguy) : Because of an error, this step can't continue: 
2017/10/30 14:27:43 - Table output 2.0 - ERROR (version 7.1.0.0-12, build 1 from 2017-05-16 17.18.02 by buildguy) : org.pentaho.di.core.exception.KettleException: 
2017/10/30 14:27:43 - Table output 2.0 - Error inserting row into table [TableID] with values: [A], [I], [G], [1], [2016-02-18], [11], [2016-02-18-12.00.00.123456], [GG], [CB], [132], [null], [null], [null] 
2017/10/30 14:27:43 - Table output 2.0 - 
2017/10/30 14:27:43 - Table output 2.0 - Error inserting/updating row 
2017/10/30 14:27:43 - Table output 2.0 - executeUpdate() 
2017/10/30 14:27:43 - Table output 2.0 - 
2017/10/30 14:27:43 - Table output 2.0 - 
2017/10/30 14:27:43 - Table output 2.0 - at org.pentaho.di.trans.steps.tableoutput.TableOutput.writeToTable(TableOutput.java:385) 
2017/10/30 14:27:43 - Table output 2.0 - at org.pentaho.di.trans.steps.tableoutput.TableOutput.processRow(TableOutput.java:125) 
2017/10/30 14:27:43 - Table output 2.0 - at org.pentaho.di.trans.step.RunThread.run(RunThread.java:62) 
2017/10/30 14:27:43 - Table output 2.0 - at java.lang.Thread.run(Unknown Source) 
2017/10/30 14:27:43 - Table output 2.0 - Caused by: org.pentaho.di.core.exception.KettleDatabaseException: 
2017/10/30 14:27:43 - Table output 2.0 - Error inserting/updating row 
2017/10/30 14:27:43 - Table output 2.0 - executeUpdate() 
2017/10/30 14:27:43 - Table output 2.0 - 
2017/10/30 14:27:43 - Table output 2.0 - at org.pentaho.di.core.database.Database.insertRow(Database.java:1321) 
2017/10/30 14:27:43 - Table output 2.0 - at org.pentaho.di.trans.steps.tableoutput.TableOutput.writeToTable(TableOutput.java:262) 
2017/10/30 14:27:43 - Table output 2.0 - ... 3 more 
2017/10/30 14:27:43 - Table output 2.0 - Caused by: net.starschema.clouddb.jdbc.BQSQLFeatureNotSupportedException: executeUpdate() 
2017/10/30 14:27:43 - Table output 2.0 - at net.starschema.clouddb.jdbc.BQPreparedStatement.executeUpdate(BQPreparedStatement.java:317) 
2017/10/30 14:27:43 - Table output 2.0 - at org.pentaho.di.core.database.Database.insertRow(Database.java:1288) 
2017/10/30 14:27:43 - Table output 2.0 - ... 4 more 
2017/10/30 14:27:43 - BigQuery_rwa-tooling - Statement canceled! 
2017/10/30 14:27:43 - Simple Read Write from csv to txt - ERROR (version 7.1.0.0-12, build 1 from 2017-05-16 17.18.02 by buildguy) : Something went wrong while trying to stop the transformation: org.pentaho.di.core.exception.KettleDatabaseException: 
2017/10/30 14:27:43 - Simple Read Write from csv to txt - Error cancelling statement 
2017/10/30 14:27:43 - Simple Read Write from csv to txt - cancel() 
2017/10/30 14:27:43 - Simple Read Write from csv to txt - ERROR (version 7.1.0.0-12, build 1 from 2017-05-16 17.18.02 by buildguy) : org.pentaho.di.core.exception.KettleDatabaseException: 
2017/10/30 14:27:43 - Simple Read Write from csv to txt - Error cancelling statement 
2017/10/30 14:27:43 - Simple Read Write from csv to txt - cancel() 
2017/10/30 14:27:43 - Simple Read Write from csv to txt - 
2017/10/30 14:27:43 - Simple Read Write from csv to txt - at org.pentaho.di.core.database.Database.cancelStatement(Database.java:750) 
2017/10/30 14:27:43 - Simple Read Write from csv to txt - at org.pentaho.di.core.database.Database.cancelQuery(Database.java:732) 
2017/10/30 14:27:43 - Simple Read Write from csv to txt - at org.pentaho.di.trans.steps.tableinput.TableInput.stopRunning(TableInput.java:299) 
2017/10/30 14:27:43 - Simple Read Write from csv to txt - at org.pentaho.di.trans.Trans.stopAll(Trans.java:1889) 
2017/10/30 14:27:43 - Simple Read Write from csv to txt - at org.pentaho.di.trans.step.BaseStep.stopAll(BaseStep.java:2915) 
2017/10/30 14:27:43 - Simple Read Write from csv to txt - at org.pentaho.di.trans.steps.tableoutput.TableOutput.processRow(TableOutput.java:139) 
2017/10/30 14:27:43 - Simple Read Write from csv to txt - at org.pentaho.di.trans.step.RunThread.run(RunThread.java:62) 
2017/10/30 14:27:43 - Simple Read Write from csv to txt - at java.lang.Thread.run(Unknown Source) 
2017/10/30 14:27:43 - Simple Read Write from csv to txt - Caused by: net.starschema.clouddb.jdbc.BQSQLFeatureNotSupportedException: cancel() 
2017/10/30 14:27:43 - Simple Read Write from csv to txt - at net.starschema.clouddb.jdbc.BQStatementRoot.cancel(BQStatementRoot.java:113) 
2017/10/30 14:27:43 - Simple Read Write from csv to txt - at org.pentaho.di.core.database.Database.cancelStatement(Database.java:744) 
2017/10/30 14:27:43 - Simple Read Write from csv to txt - ... 7 more 
2017/10/30 14:27:43 - Table output 2.0 - Signaling 'output done' to 0 output rowsets. 
2017/10/30 14:27:43 - BigQuery_prID - No commit possible on database connection [BigQuery_prID] 

Answer


It looks like you may be attempting to do this via legacy SQL, which has no support for DML statements (INSERT/UPDATE/DELETE).

Standard SQL does support DML, but it is geared toward bulk table manipulation rather than row-oriented inserts. Ingesting data via individual DML INSERT statements is not recommended; see the quotas in the DML reference documentation for details.
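
To make the "bulk rather than row-oriented" point concrete: if you do use DML, batching many rows into one multi-row INSERT consumes one statement instead of one per row. A minimal sketch (table and column names are hypothetical; the quoting here is simplified and not a substitute for parameterized queries):

```python
# Sketch: collapse row-by-row INSERTs into a single multi-row
# standard-SQL DML statement, reducing the DML statement count.

def build_bulk_insert(table, columns, rows):
    """Build one INSERT ... VALUES statement covering many rows."""
    cols = ", ".join(columns)

    def fmt(value):
        # Naive literal formatting for illustration only.
        if value is None:
            return "NULL"
        if isinstance(value, (int, float)):
            return str(value)
        return "'" + str(value).replace("'", "\\'") + "'"

    values = ",\n  ".join(
        "(" + ", ".join(fmt(v) for v in row) + ")" for row in rows
    )
    return f"INSERT INTO `{table}` ({cols})\nVALUES\n  {values}"

sql = build_bulk_insert(
    "my_dataset.my_table",            # hypothetical table
    ["id", "status", "created_on"],   # hypothetical columns
    [(1, "A", "2016-02-18"), (2, "I", None)],
)
print(sql)
```

Even so, the quota on DML statements still applies, which is why the load-job and streaming paths below are preferable for ingestion.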

You would likely be better served by BigQuery's streaming inserts, or by bulk ingestion via load jobs, but both of those mechanisms live outside the query language and will likely require you to move away from the JDBC driver.
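
As a sketch of the streaming path, here is what the insert looks like with the official `google-cloud-bigquery` Python client instead of JDBC. The project/dataset/table names are hypothetical, and running the actual insert requires Google Cloud credentials:

```python
# Sketch: streaming rows into BigQuery with the google-cloud-bigquery
# Python client, bypassing the Starschema JDBC driver entirely.

def rows_to_json(rows, columns):
    """Convert tuple rows (e.g. from a CSV step) into the JSON
    payloads that insert_rows_json() expects."""
    return [dict(zip(columns, row)) for row in rows]

def stream_rows(table_id, payloads):
    # Import here so the payload helper above works without credentials.
    from google.cloud import bigquery

    client = bigquery.Client()  # uses application-default credentials
    errors = client.insert_rows_json(table_id, payloads)
    if errors:
        raise RuntimeError(f"Streaming insert failed: {errors}")

if __name__ == "__main__":
    payloads = rows_to_json([(1, "A"), (2, "I")], ["id", "status"])
    stream_rows("my-project.my_dataset.my_table", payloads)
```

For large batches, a load job (e.g. `Client.load_table_from_file` over newline-delimited JSON or CSV) is usually cheaper than streaming; either way this runs as a scripted step alongside Pentaho rather than through the Table output step.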
