
Issue with Confluent JDBC Source connector
This article describes how to deal with an issue with the Confluent JDBC Source connector; the answer below should be a useful reference for anyone running into the same problem.

Problem description

I'm getting the issue below while trying to use Confluent for Kafka Connect. I'm following the demo given at https://www.confluent.io/blog/simplest-useful-kafka-connect-data-pipeline-world-thereabouts-part-1/

Error:

    ./bin/confluent load jdbc_source_mysql_foobar_01 -d /tmp/kafka-connect-jdbc-source.json
    This CLI is intended for development only, not for production
    https://docs.confluent.io/current/cli/index.html

    {
      "error_code": 400,
      "message": "Connector configuration is invalid and contains the following 2 error(s):\nInvalid value java.sql.SQLException: No suitable driver found for jdbc:mysql://localhost:3306/demo?user=root&password=tiger for configuration Couldn't open connection to jdbc:mysql://localhost:3306/demo?user=root&password=tiger\nInvalid value java.sql.SQLException: No suitable driver found for jdbc:mysql://localhost:3306/demo?user=root&password=tiger for configuration Couldn't open connection to jdbc:mysql://localhost:3306/demo?user=root&password=tiger\nYou can also find the above list of errors at the endpoint `/{connectorType}/config/validate`"
    }
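For context, the file passed with -d (/tmp/kafka-connect-jdbc-source.json above) is the connector configuration submitted to Kafka Connect. A minimal sketch of what it might contain is shown below; the connection.url is taken from the error message, while the table and column names are assumptions for illustration rather than the exact file from the demo:

    {
      "name": "jdbc_source_mysql_foobar_01",
      "config": {
        "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
        "connection.url": "jdbc:mysql://localhost:3306/demo?user=root&password=tiger",
        "table.whitelist": "foobar",
        "mode": "incrementing",
        "incrementing.column.name": "foobar_pk",
        "topic.prefix": "mysql-"
      }
    }

The error itself comes from Connect's configuration validation: when it tries to open the connection given in connection.url, the JVM cannot find a MySQL JDBC driver on the classpath, hence "No suitable driver found".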

Recommended answer

The message No suitable driver found for means that the proper JDBC driver cannot be found.

According to the article, to fix the above error you need to place the appropriate JDBC driver in share/java/kafka-connect-jdbc:

To use the JDBC connector, you'll need to make the relevant JDBC driver for your source database available. The connector ships with drivers for PostgreSQL and sqlite; for all others, download the appropriate JAR and place it in share/java/kafka-connect-jdbc. You can find the relevant downloads for MySQL, Oracle, SQL Server, DB2, and Teradata.

After copying the JAR, you need to restart Kafka Connect, as sketched below.
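A minimal sketch of the fix, assuming a MySQL source, the development confluent CLI used above, and a driver JAR already downloaded (the JAR file name, version, and install path are placeholders to adjust for your setup):

    # Put the MySQL JDBC driver where the kafka-connect-jdbc plugin can load it
    cp mysql-connector-java-5.1.45-bin.jar /path/to/confluent/share/java/kafka-connect-jdbc/

    # Restart the Connect worker so it picks up the new driver, then re-load the connector
    ./bin/confluent stop connect
    ./bin/confluent start connect
    ./bin/confluent load jdbc_source_mysql_foobar_01 -d /tmp/kafka-connect-jdbc-source.json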

This concludes this article on the issue with the Confluent JDBC Source connector; hopefully the recommended answer above is helpful.

