MySQLSyntaxErrorException: Specified key was too long; max key length is 767 bytes
A record of an error encountered while integrating Hive 0.13 with HBase 0.98.6.1.
hive> CREATE TABLE hbase_table_1(key int, value string)
    > STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
    > WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,cf1:val")
    > TBLPROPERTIES ("hbase.table.name" = "xyz2");
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:javax.jdo.JDODataStoreException: An exception was thrown while adding/validating class(es) : Specified key was too long; max key length is 767 bytes
com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Specified key was too long; max key length is 767 bytes
	at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:1062)
	at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:4208)
	at com.mysql.jdbc.StatementImpl.execute(StatementImpl.java:907)
	at com.jolbox.bonecp.StatementHandle.execute(StatementHandle.java:254)
	at org.datanucleus.store.rdbms.table.AbstractTable.executeDdlStatement(AbstractTable.java:760)
	at org.datanucleus.store.rdbms.table.AbstractTable.create(AbstractTable.java:425)
	at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.performTablesValidation(RDBMSStoreManager.java:3380)
	at org.datanucleus.api.jdo.JDOPersistenceManager.makePersistent(JDOPersistenceManager.java:752)
	at org.apache.hadoop.hive.metastore.ObjectStore.createTable(ObjectStore.java:719)
	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_table_with_environment_context(HiveMetaStore.java:1294)
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:558)
	at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:613)
	at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:4189)
	at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:281)
	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:911)
	at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:625)
	at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
(The reflection frames and the NestedThrowablesStackTrace section, which repeats the same com.mysql.jdbc / DataNucleus / Hive metastore path, have been trimmed here for readability.)

Cause:
A VARCHAR primary key (or any index key) in MySQL's InnoDB cannot exceed 767 bytes: at most 767 single-byte characters, 383 double-byte characters, or 255 triple-byte characters.
GBK is a double-byte encoding and MySQL's utf8 stores up to three bytes per character, so a utf8 VARCHAR(256) key already needs 768 bytes and exceeds the limit, which is exactly what the Hive metastore schema runs into here.
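The limit is easy to reproduce directly in MySQL, outside of Hive. The following is only an illustrative sketch: the database and table names (keylen_demo, demo_fail, demo_ok) are invented, and it assumes InnoDB with MySQL's 3-byte utf8 character set and the default index-prefix settings:

CREATE DATABASE keylen_demo CHARACTER SET utf8;
USE keylen_demo;

-- 256 characters x 3 bytes = 768 bytes > 767, fails with "Specified key was too long"
CREATE TABLE demo_fail (k VARCHAR(256) NOT NULL, PRIMARY KEY (k)) ENGINE=InnoDB;

-- 255 characters x 3 bytes = 765 bytes <= 767, succeeds
CREATE TABLE demo_ok (k VARCHAR(255) NOT NULL, PRIMARY KEY (k)) ENGINE=InnoDB;

On servers where innodb_large_prefix is enabled (it is the default on newer MySQL versions), the first statement may be accepted, so treat this purely as an illustration of the arithmetic.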
Check the current character-set settings:
mysql> show variables like '%char%';
+--------------------------+--------------------------------------------+
| Variable_name            | Value                                      |
+--------------------------+--------------------------------------------+
| character_set_client     | gbk                                        |
| character_set_connection | gbk                                        |
| character_set_database   | utf8                                       |
| character_set_filesystem | binary                                     |
| character_set_results    | gbk                                        |
| character_set_server     | utf8                                       |
| character_set_system     | utf8                                       |
| character_sets_dir       | D:\soft\REDMIN~1.2-0\mysql\share\charsets\ |
+--------------------------+--------------------------------------------+
8 rows in set (0.05 sec)

For a more detailed walkthrough, see: http://blog.csdn.net/cindy9902/article/details/6215769
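Since it is the metastore database's own default character set that determines how wide its index keys become, it is also worth checking that database directly. A small sketch; 'hive' is only an assumed name for the metastore database, so substitute whatever database your javax.jdo.option.ConnectionURL points at:

mysql> SELECT default_character_set_name, default_collation_name
    ->   FROM information_schema.SCHEMATA
    ->  WHERE schema_name = 'hive';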
Solutions:
(1) The simplest fix is to change the character-set settings in MySQL's my.ini (or my.cnf) file, for example:
        default-character-set = utf8
        character_set_server = utf8
    After saving the change, restart the MySQL service (for example, service mysql restart) and verify with:
        mysql> SHOW VARIABLES LIKE 'character%';
    A sketch of the relevant option-file sections follows below.
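For reference, here is what the option file might look like. This is only a sketch: the section layout is my assumption, and option names vary slightly between MySQL versions (in MySQL 5.5 and later, default-character-set is accepted only in client sections such as [client] and [mysql], while the server side uses character-set-server):

[client]
default-character-set = utf8

[mysql]
default-character-set = utf8

[mysqld]
character-set-server = utf8
collation-server = utf8_general_ci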
(2) Another way to change the character set is with MySQL commands:
mysql> SET character_set_client = utf8 ;
Query OK, 0 rows affected (0.02 sec)

mysql> SET character_set_connection = utf8 ;
Query OK, 0 rows affected (0.00 sec)

mysql> SET character_set_database = utf8 ;
Query OK, 0 rows affected (0.00 sec)

mysql> SET character_set_results = utf8 ;
Query OK, 0 rows affected (0.00 sec)

mysql> SET character_set_server = utf8 ;
Query OK, 0 rows affected (0.00 sec)

mysql> SET collation_connection = utf8 ;
ERROR 1273 (HY000): Unknown collation: 'utf8'

mysql> SET collation_database = utf8 ;
ERROR 1273 (HY000): Unknown collation: 'utf8'

mysql> SET collation_server = utf8 ;
ERROR 1273 (HY000): Unknown collation: 'utf8'

mysql> SET NAMES 'utf8';
Query OK, 0 rows affected (0.00 sec)

Check the character set after the change:

mysql> SHOW VARIABLES LIKE 'character%';
+--------------------------+--------------------------------------------+
| Variable_name            | Value                                      |
+--------------------------+--------------------------------------------+
| character_set_client     | utf8                                       |
| character_set_connection | utf8                                       |
| character_set_database   | utf8                                       |
| character_set_filesystem | binary                                     |
| character_set_results    | utf8                                       |
| character_set_server     | utf8                                       |
| character_set_system     | utf8                                       |
| character_sets_dir       | D:\soft\REDMIN~1.2-0\mysql\share\charsets\ |
+--------------------------+--------------------------------------------+
8 rows in set (0.00 sec)
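A side note on the three ERROR 1273 lines above: the collation_* variables expect a collation name, not a character-set name, which is why SET collation_connection = utf8 is rejected. Something like the following should work instead (utf8_general_ci is just one common utf8 collation, chosen here as an assumption; pick whichever your schema actually uses):

mysql> SET collation_connection = utf8_general_ci;
mysql> SET collation_database = utf8_general_ci;
mysql> SET collation_server = utf8_general_ci;

Also note that a plain SET only affects the current session; use SET GLOBAL for new connections, and the option-file change from (1) if the setting must survive a server restart.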
Summary

The "Specified key was too long; max key length is 767 bytes" failure comes from MySQL's 767-byte index key limit colliding with a multi-byte character set in the Hive metastore database; aligning the character-set settings as described above resolves the error.